
Deep learning-based complex road condition perception system and apparatus

A technology combining deep learning with complex road condition perception, applied in the fields of instruments, character and pattern recognition, computer components, etc. It addresses the problems that traditional methods cannot achieve practical value, that different features cannot be shared among different classifiers, and that detection efficiency cannot be improved.

Pending Publication Date: 2018-03-27
苏州天瞳威视电子科技有限公司 +1

AI Technical Summary

Problems solved by technology

[0003] Traditional methods can only perform single-target detection. If multi-target detection is required, different features and different classifiers must be used, which increases the design difficulty of the entire system. Because features cannot be shared among the different classifiers, computation cannot be reused and detection efficiency cannot be improved; moreover, the generalization ability of traditional algorithms in complex scenes is weak, so they cannot achieve practical value.



Examples


Embodiment 1

[0025] A deep learning-based complex road condition perception system. The system comprises five steps: in the first step, image sensor data is collected; in the second step, the image data is preprocessed, including white balance, gamma correction, and denoising; in the third step, a first deep neural network is used for feature extraction; in the fourth step, a second deep neural network performs target localization using the extracted features; and in the fifth step, a third deep neural network performs target recognition on the localized regions to obtain the final detection result.
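The patent does not name specific algorithms for the second step, so the following is only a minimal sketch of what the preprocessing chain could look like, assuming gray-world white balance, lookup-table gamma correction, and OpenCV non-local-means denoising; the `gamma` value and the denoising parameters are illustrative, not taken from the patent.

```python
# Sketch of the Step-2 preprocessing chain (assumed algorithms: gray-world white
# balance, LUT-based gamma correction, non-local-means denoising).
import cv2
import numpy as np

def gray_world_white_balance(img: np.ndarray) -> np.ndarray:
    """Scale each BGR channel so its mean matches the global mean (gray-world assumption)."""
    img = img.astype(np.float32)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gain = channel_means.mean() / (channel_means + 1e-6)
    return np.clip(img * gain, 0, 255).astype(np.uint8)

def gamma_correct(img: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Apply gamma correction with a precomputed lookup table."""
    lut = ((np.arange(256) / 255.0) ** (1.0 / gamma) * 255).astype(np.uint8)
    return cv2.LUT(img, lut)

def preprocess(frame: np.ndarray) -> np.ndarray:
    """White balance -> gamma correction -> denoising, mirroring Step 2."""
    frame = gray_world_white_balance(frame)
    frame = gamma_correct(frame)
    # Illustrative denoising parameters (h, hColor, templateWindowSize, searchWindowSize).
    return cv2.fastNlMeansDenoisingColored(frame, None, 10, 10, 7, 21)
```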

Embodiment 2

[0027] In the deep learning-based complex road condition perception system described in Embodiment 1, the third step performs feature multiplexing and the fourth step separates the foreground from the background; the networks used in the third, fourth, and fifth steps are integrated into a single deeper deep neural network, with the network used in each step treated as a sub-network of the whole network.
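As a hedged illustration of this embodiment, the PyTorch sketch below (layer sizes and class count are assumptions, not taken from the patent) folds the three step-networks into one deeper network: a shared feature-extraction sub-network whose output is multiplexed by both a foreground/background localization head and a recognition head, so features are computed once and reused.

```python
# Illustrative sketch of Embodiment 2: one deeper network whose sub-networks
# correspond to Steps 3-5, with the extracted features shared (multiplexed).
import torch
import torch.nn as nn

class UnifiedPerceptionNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Sub-network 1: shared feature extraction (Step 3).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Sub-network 2: foreground/background separation for localization (Step 4).
        self.fg_bg_head = nn.Conv2d(64, 2, kernel_size=1)
        # Sub-network 3: recognition of the located regions (Step 5).
        self.recognition_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_classes)
        )

    def forward(self, x: torch.Tensor):
        features = self.backbone(x)                       # computed once ...
        fg_bg_map = self.fg_bg_head(features)             # ... reused by the localization head
        class_logits = self.recognition_head(features)    # ... and by the recognition head
        return fg_bg_map, class_logits

# Usage: one forward pass yields both localization and recognition outputs.
net = UnifiedPerceptionNet()
fg_bg, logits = net(torch.randn(1, 3, 224, 224))
```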

Embodiment 3

[0029] In the deep learning-based complex road condition perception system described in Embodiment 2, each sub-network is a multi-scale neural network that performs feature extraction at different scales; finally, a dedicated sub-network integrates the features extracted at the different scales to obtain the final multi-scale features.
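A hedged sketch of such a multi-scale sub-network follows: each scale gets its own feature extractor, and a dedicated fusion sub-network integrates the per-scale features into the final multi-scale feature map. The number of scales, channel widths, and concatenation-based fusion are assumptions for illustration only.

```python
# Illustrative multi-scale sub-network for Embodiment 3: per-scale extraction
# followed by a dedicated fusion sub-network.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleFeatureNet(nn.Module):
    def __init__(self, in_ch: int = 3, out_ch: int = 64):
        super().__init__()
        # Per-scale feature extractors (full, 1/2, and 1/4 resolution).
        self.scale_convs = nn.ModuleList(
            [nn.Conv2d(in_ch, out_ch, 3, padding=1) for _ in range(3)]
        )
        # The dedicated sub-network that integrates the per-scale features.
        self.fusion = nn.Conv2d(out_ch * 3, out_ch, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, w = x.shape[-2:]
        per_scale = []
        for i, conv in enumerate(self.scale_convs):
            scaled = F.interpolate(x, scale_factor=1 / (2 ** i), mode="bilinear",
                                   align_corners=False) if i else x
            feat = F.relu(conv(scaled))
            # Resize every scale back to the input resolution before fusion.
            per_scale.append(F.interpolate(feat, size=(h, w), mode="bilinear",
                                           align_corners=False))
        return self.fusion(torch.cat(per_scale, dim=1))  # final multi-scale features
```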



Abstract

The invention discloses a deep learning-based complex road condition perception system and apparatus. The system comprises five steps: 1, collecting data from an image sensor; 2, preprocessing the image data, including white balance, gamma correction, and denoising; 3, performing feature extraction with a first deep neural network; 4, performing target localization with a second deep neural network using the extracted features; and 5, performing target recognition on the localized regions with a third deep neural network to obtain the final detection result. The apparatus is used for the deep learning-based complex road condition perception system.

Description

Technical field:
[0001] The invention relates to a deep learning-based complex road condition perception system and device.
Background technique:
[0002] Deep learning technology is used to integrate feature extraction, target localization, target recognition, and other techniques into a unified deep neural network, forming a complete end-to-end solution; the model complexity of the deep neural network is optimized for the performance of the vehicle-mounted equipment, so that vision tasks in the vehicle system can be processed quickly and accurately.
[0003] Traditional methods can only perform single-target detection. If multi-target detection is required, different features and different classifiers must be used, which increases the design difficulty of the entire system. Because features cannot be shared among the different classifiers, computation cannot be reused and detection efficiency cannot be improved; moreover, the generalization ability of traditional algorithms in complex scenes is weak, so they cannot achieve practical value.

Claims


Application Information

IPC(8): G06K9/00, G06K9/62
CPC: G06V20/56, G06V2201/07, G06F18/24, G06F18/253
Inventor: 赵贵平王曦宝鹤鹏温泉李宗南
Owner: 苏州天瞳威视电子科技有限公司