
A bridge vehicle wheel detection method based on a multilayer feature fusion neural network model

A neural network model and feature fusion technology, applied in the field of deep-learning-based bridge vehicle wheel detection, which can solve the problems of reduced classification-task accuracy, degraded final algorithm performance, and increased network burden.

Active Publication Date: 2019-06-14
TONGJI UNIV

AI Technical Summary

Problems solved by technology

However, using vehicle attitude classification as an auxiliary task of the vehicle detection task places an additional burden on the network, and when an image contains multiple vehicle targets the accuracy of the classification task drops, which strongly degrades the final algorithm performance.
[0009] At present, there is no satisfactory method for detecting vehicle wheels, and no existing wheel detection method handles real-world scenarios.



Examples


Embodiment

[0085] In order to make the objects, technical scheme and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with the embodiments and the algorithm flow chart shown in figure 1. It should be understood that the specific embodiments described here are only used to explain the present invention and are not intended to limit it.

[0086] Step 1: Construct a deep learning neural network model based on multi-layer feature fusion. The specific description is as follows: the multi-layer feature fusion neural network model consists of a feature extraction module and a multi-layer feature fusion module, which together extract a series of feature maps of different sizes from the image to be detected. The deep learning neural network model based on multi-layer feature fusion adds a multi-layer feature fusion module on top of the feature extraction module, which integrates the ...
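The visible text does not give the backbone architecture or the exact fusion operation, so the following is only a minimal sketch of the two-module structure, assuming PyTorch, a small VGG-style backbone, and an upsample-and-concatenate fusion; the class names FeatureExtractor and MultiLayerFusion and all layer sizes are illustrative rather than the patent's own definitions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureExtractor(nn.Module):
    """Backbone that returns feature maps at several scales (assumed design)."""
    def __init__(self):
        super().__init__()
        def block(c_in, c_out):
            return nn.Sequential(
                nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True),
                nn.MaxPool2d(2))
        self.stage1 = block(3, 64)     # 300 -> 150
        self.stage2 = block(64, 128)   # 150 -> 75
        self.stage3 = block(128, 256)  # 75  -> 37

    def forward(self, x):
        f1 = self.stage1(x)
        f2 = self.stage2(f1)
        f3 = self.stage3(f2)
        return [f1, f2, f3]            # feature maps of different sizes

class MultiLayerFusion(nn.Module):
    """Fuses a deeper (smaller) map into a shallower (larger) one."""
    def __init__(self, shallow_ch=128, deep_ch=256, out_ch=256):
        super().__init__()
        self.reduce = nn.Conv2d(shallow_ch + deep_ch, out_ch, 1)

    def forward(self, shallow, deep):
        # Upsample the deep map to the shallow map's spatial size,
        # concatenate along channels, then mix with a 1x1 convolution.
        deep_up = F.interpolate(deep, size=shallow.shape[-2:],
                                mode="bilinear", align_corners=False)
        return self.reduce(torch.cat([shallow, deep_up], dim=1))
```

Under these assumptions, `maps = FeatureExtractor()(torch.randn(1, 3, 300, 300))` yields three maps of decreasing size, and `MultiLayerFusion()(maps[1], maps[2])` produces one fused map at the larger resolution.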

Specific embodiments

[0163] Figure 1 is a flow chart for realizing the method of the present invention; the specific embodiments are as follows:

[0164] 1. Build a feature extraction module;

[0165] 2. Build a multi-layer feature fusion module;

[0166] 3. Build a multi-task loss function;

[0167] 4. Resize all training-set images to 300×300;

[0168] 5. The initial learning rate of training is set to 0.001, and the number of iterations is set to 100,000. After 60,000 iterations the learning rate is reduced to 10^-4, and after 80,000 iterations it is reduced to 10^-5.

[0169] 6. Repeatedly feed the training images into the model, compute the loss value according to the training loss function, and use the stochastic gradient descent (SGD) algorithm to adjust the model parameters until the number of training iterations reaches the set value (see the training-loop sketch after this list);

[0170] 7. Perform image enhancement preprocessing on the image to be detected;

[0171] 8. Adjust the size of the...
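The settings in steps 4-6 (300×300 inputs, SGD, initial learning rate 0.001, 100,000 iterations with tenfold decays at 60,000 and 80,000) can be wired together roughly as below. This is a hedged sketch assuming PyTorch; the model, data loader, multi-task loss, and momentum value are placeholders, not the patent's own definitions.

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import MultiStepLR
from torchvision import transforms

# Step 4: resize all training images to 300x300.
resize_300 = transforms.Compose([
    transforms.Resize((300, 300)),
    transforms.ToTensor(),
])

def train(model, data_loader, multi_task_loss, max_iters=100_000):
    # Momentum value is an assumption; only SGD and the schedule come from the text.
    optimizer = SGD(model.parameters(), lr=1e-3, momentum=0.9)
    # Step 5: 1e-3 -> 1e-4 at 60,000 iterations, -> 1e-5 at 80,000 iterations.
    scheduler = MultiStepLR(optimizer, milestones=[60_000, 80_000], gamma=0.1)
    it = 0
    while it < max_iters:
        for images, targets in data_loader:          # step 6: repeat over the data
            preds = model(images)
            loss = multi_task_loss(preds, targets)   # step 3's multi-task loss
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            scheduler.step()                         # iteration-based schedule
            it += 1
            if it >= max_iters:
                break
    return model
```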


Abstract

The invention relates to a bridge vehicle wheel detection method based on a multilayer feature fusion neural network model. The method comprises the following steps: 1) constructing a deep learning neural network model based on multilayer feature fusion; 2) training the model by using the training sample data set; 3) carrying out image enhancement preprocessing on the to-be-detected image shot on the bridge; 4) inputting the preprocessed image into the model to obtain an output image with wheel and vehicle category and coordinate calibration; and 5) matching the detected wheels in the image with the corresponding vehicles by using an overlap rate measurement method. Compared with the prior art, the method improves detection precision, enhances real-time performance, and reduces the probability of missed detection; wheel detection is realized together with vehicle target detection, and the vehicles in the same image are automatically matched with their wheels.
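Step 5 of the abstract matches each detected wheel to a vehicle by an overlap rate measure. The exact measure is not specified in the visible text, so the sketch below assumes one plausible reading: the fraction of a wheel box's area covered by a vehicle box, with each wheel assigned to its best-overlapping vehicle above a threshold; function names and the threshold value are illustrative.

```python
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)

def overlap_rate(wheel: Box, vehicle: Box) -> float:
    """Fraction of the wheel box's area that lies inside the vehicle box (assumed measure)."""
    ix1, iy1 = max(wheel[0], vehicle[0]), max(wheel[1], vehicle[1])
    ix2, iy2 = min(wheel[2], vehicle[2]), min(wheel[3], vehicle[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    wheel_area = (wheel[2] - wheel[0]) * (wheel[3] - wheel[1])
    return inter / wheel_area if wheel_area > 0 else 0.0

def match_wheels_to_vehicles(wheels: List[Box], vehicles: List[Box],
                             threshold: float = 0.5) -> List[Tuple[int, int]]:
    """Return (wheel_index, vehicle_index) pairs whose overlap rate passes the threshold."""
    matches = []
    for wi, w in enumerate(wheels):
        rates = [overlap_rate(w, v) for v in vehicles]
        if rates:
            best = max(range(len(vehicles)), key=lambda vi: rates[vi])
            if rates[best] >= threshold:
                matches.append((wi, best))
    return matches
```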

Description

Technical field
[0001] The invention relates to the fields of intelligent analysis of surveillance video and bridge external load monitoring, and in particular to a method for detecting bridge vehicle wheels based on deep learning.
Background technique
[0002] Vehicle wheel detection belongs to the object detection task. The task of target detection is to find all the targets (objects) of interest in an image, which is one of the core problems in the field of machine vision. Object detection requires not only locating the object but also knowing what the object is. For a computer facing an RGB pixel matrix, it is difficult to obtain abstract concepts such as cars and boats directly from the image and locate their positions, and object pose, illumination, and complex backgrounds are mixed together, which makes object detection even more difficult. In the present invention, the objects of interest are vehicles and wheels.
[0003] Target d...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N3/04
CPC: Y02T10/40
Inventors: 赵才荣, 傅佳悦, 夏烨
Owner: TONGJI UNIV