
Vehicle radar data and camera data fusion method and system

A method and system for fusing vehicle radar data and camera data, in the field of data fusion. It addresses problems such as poor camera ranging accuracy, sensitivity to lighting conditions, and the lack of semantics in radar measurements, and achieves low hardware performance requirements, reduced computing-power demands, and improved fusion efficiency.

Active Publication Date: 2021-08-31
WUHAN KOTEI INFORMATICS

Problems solved by technology

Millimeter-wave radar ranges accurately and is unaffected by environmental factors such as light, but its measurements carry no semantics. Monocular cameras can reliably identify targets such as pedestrians and vehicles, but their ranging accuracy is poor and easily degraded by light.




Embodiment Construction

[0021] The specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings. The following examples illustrate the present invention but are not intended to limit its scope.

[0022] Figure 1 is a flow chart of a fusion method for vehicle radar data and camera data provided by an embodiment of the present invention. As shown in Figure 1, the method includes: 101. Receive radar data and camera data for a detected target, and perform time-space registration on the radar data and the camera data; 102. Match the targets detected by the radar with the targets detected by the camera to obtain radar data and camera data for the same target; 103. Use the longitudinal distance and longitudinal relative velocity of the target in the radar data as the longitudinal distance and longitudinal velocity of the fused target, and use the lateral distance and lateral relative velocity in the camera data as the lateral distance and lateral velocity of the fused target.
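Steps 102 and 103 can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: the `Detection` class, the `match` gating function, and its distance thresholds are all assumptions introduced here for clarity; only the fusion rule itself (longitudinal state from radar, lateral state from camera) comes from the described method.

```python
# Hypothetical sketch of the fusion rule in steps 102-103. Assumes
# time-space registration (step 101) has already been applied, so both
# detections are in the same coordinate frame and time instant.
from dataclasses import dataclass


@dataclass
class Detection:
    x: float   # longitudinal distance (m)
    y: float   # lateral distance (m)
    vx: float  # longitudinal relative velocity (m/s)
    vy: float  # lateral relative velocity (m/s)


def match(radar: Detection, cam: Detection,
          max_dx: float = 3.0, max_dy: float = 1.5) -> bool:
    """Step 102 (assumed gating logic): treat the two detections as the
    same target if their registered positions fall within illustrative
    distance thresholds. The patent does not specify this criterion."""
    return abs(radar.x - cam.x) <= max_dx and abs(radar.y - cam.y) <= max_dy


def fuse(radar: Detection, cam: Detection) -> Detection:
    """Step 103, per the described rule: longitudinal distance and
    velocity are taken from the radar; lateral distance and velocity
    are taken from the camera. No complex fusion algorithm is run."""
    return Detection(x=radar.x, y=cam.y, vx=radar.vx, vy=cam.vy)


# Example with made-up measurements of one target:
radar_det = Detection(x=42.3, y=0.9, vx=-4.1, vy=0.0)
cam_det = Detection(x=44.0, y=1.2, vx=-3.5, vy=0.1)
if match(radar_det, cam_det):
    fused = fuse(radar_det, cam_det)
    print(fused)  # longitudinal state from radar, lateral state from camera
```

The design point the patent stresses is that `fuse` is pure selection, not estimation: because each field is copied from whichever sensor measures it best, no filter or optimization runs per target, which is what keeps the computing-power requirement low.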



Abstract

According to the vehicle radar data and camera data fusion method and system provided by the invention, the radar data and camera data of a target are fused without a complex algorithm: the longitudinal distance and longitudinal relative velocity of the target in the radar data are used directly as the longitudinal distance and longitudinal velocity of the fused target, and the lateral distance and lateral relative velocity in the camera data are used as the lateral distance and lateral velocity of the fused target. This reduces the computing-power requirement of the fusion process, places relatively low performance demands on the hardware, and improves the efficiency of target data fusion.

Description

Technical field

[0001] The invention relates to the field of data fusion, and more specifically to a method and system for fusing vehicle radar data and camera data.

Background technique

[0002] Advanced Driver Assistance Systems (ADAS) perceive the surroundings of the vehicle through sensors such as millimeter-wave radar and monocular cameras, and control the vehicle according to the perception results to improve driving experience and safety. Millimeter-wave radar ranges accurately and is unaffected by environmental factors such as light, but its measurements carry no semantics. Monocular cameras can reliably identify targets such as pedestrians and vehicles, but their ranging accuracy is poor and easily degraded by light. A fusion algorithm can compensate for the shortcomings of each sensor, exploit the strengths of both, and improve perception accuracy and robustness to interference. Contents ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01S13/86
CPC: G01S13/867
Inventor: 程德心, 张家豪, 张伟, 王伟华
Owner: WUHAN KOTEI INFORMATICS