
Multi-laser radar and multi-camera sensor spatial position automatic calibration method

A technology for spatial-position calibration of laser radars, applied to instruments, image analysis, and computation. It addresses the problem that the reliability of calibration results cannot be guaranteed, and achieves more accurate calibration results, improved automatic-calibration accuracy, and a reduced overall error.

Status: Pending · Publication date: 2022-06-03
Applicant: 苏州驾驶宝智能科技有限公司

AI Technical Summary

Problems solved by technology

In practical application scenarios that use multiple cameras and multiple lidars, single-camera-to-single-lidar fusion calibration cannot guarantee the reliability of the calibration results.



Examples


Embodiment 1

[0062] As shown in Figure 1, Embodiment 1 of the present invention proposes a method for automatically calibrating the spatial positions of multiple lidars and multiple camera sensors. The embodiment is described with two lidars and two cameras, but the method is not limited to these numbers. The method includes:

[0063] Obtain the spatial position of each lidar relative to a camera sensor, the spatial position between the two cameras, and the spatial position between the two lidars, thereby obtaining the spatial position relationships among the multiple lidars and camera sensors;
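Paragraph [0063] treats each pairwise calibration result as a rigid spatial transform. As a minimal sketch (not part of the patent; all function names and example values below are illustrative assumptions), these relationships can be stored as 4x4 homogeneous matrices, which makes the pairwise results composable and lets an unmeasured pair be cross-checked along different paths:

```python
import numpy as np

def make_transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Pack a 3x3 rotation and a 3-vector translation into a 4x4 rigid transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical pairwise results from the three calibration stages:
# T_cam1_lidar1  : lidar1 frame -> camera1 frame  (lidar-camera calibration)
# T_lidar2_lidar1: lidar1 frame -> lidar2 frame   (lidar-lidar registration)
# T_cam2_cam1    : camera1 frame -> camera2 frame (camera-camera calibration)
T_cam1_lidar1 = make_transform(np.eye(3), np.array([0.10, 0.00, -0.20]))
T_lidar2_lidar1 = make_transform(np.eye(3), np.array([-1.00, 0.00, 0.00]))
T_cam2_cam1 = make_transform(np.eye(3), np.array([-1.00, 0.05, 0.00]))

# Composition yields the unmeasured pair lidar2 -> camera2. Comparing such
# compositions along different paths exposes the loop-closure error that the
# global optimization stage can minimize.
T_cam2_lidar2 = T_cam2_cam1 @ T_cam1_lidar1 @ np.linalg.inv(T_lidar2_lidar1)
print(T_cam2_lidar2)
```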

[0064] Figure 3(a) and Figure 3(b) show, respectively, the images collected by the two cameras and the point cloud data collected by the two lidars.

[0065] For the spatial position relationship between a lidar and a camera: filter the points that conform to line features from the lidar point cloud data; filter the pixels that conform to line features from the camera sensor image data; project the lidar points that conform to line features onto the camera s...
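To make the projection-and-scoring idea of paragraph [0065] concrete, here is a hedged sketch: lidar points on line features are projected through candidate extrinsics and a pinhole intrinsic matrix into a gray-value edge map, and the gray values hit by the projections are summed. This is not the patent's code; the function name, its arguments, and the use of a plain edge map are assumptions.

```python
import numpy as np

def line_feature_score(points_lidar: np.ndarray,  # (N, 3) lidar points on line features
                       T_cam_lidar: np.ndarray,   # (4, 4) candidate lidar->camera extrinsics
                       K: np.ndarray,             # (3, 3) camera intrinsic matrix
                       edge_image: np.ndarray) -> float:  # (H, W) gray-value edge map
    """Project lidar line-feature points into the image and sum the gray values hit."""
    # Transform into the camera frame and keep points in front of the camera.
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    pts_cam = pts_cam[pts_cam[:, 2] > 0]

    # Pinhole projection onto the image plane.
    uv = (K @ pts_cam.T).T
    uv = (uv[:, :2] / uv[:, 2:3]).round().astype(int)

    # Keep projections that land inside the image and accumulate their gray values.
    h, w = edge_image.shape
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    return float(edge_image[uv[ok, 1], uv[ok, 0]].sum())
```

The adaptive optimization described in the abstract would then search over candidate T_cam_lidar values, for instance by perturbing an initial guess, and keep the extrinsics with the highest total score.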



Abstract

The invention discloses an automatic calibration method for the spatial positions of multiple laser radars and multiple camera sensors. The method comprises the steps of: 1) extracting line features from the point cloud data collected by each laser radar and from the RGB images collected by each camera sensor, projecting the laser radar line-feature points onto the image, scoring each projection by the gray value at its location, and summing the scores over all laser radar points; an adaptive optimization method then selects the projection with the highest total score, yielding the position relation between each laser radar and a given camera sensor; 2) registering the point cloud data of every two laser radars to obtain the position relationship between the laser radars; 3) obtaining the position relationship among the camera sensors from the camera image data by using the epipolar geometry constraints of the multiple cameras; and 4) performing global optimization over the position relations obtained in steps 1), 2) and 3), completing the calibration of the spatial positions of the multiple laser radars and multiple camera sensors.
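Steps 2) and 3) of the abstract correspond to standard point cloud registration and epipolar geometry problems, so a compact sketch with off-the-shelf tools is possible. This is not the patent's implementation; the choice of Open3D ICP, OpenCV's essential-matrix routines, and all thresholds below are assumptions.

```python
import numpy as np
import open3d as o3d
import cv2

def register_lidars(cloud_a: np.ndarray, cloud_b: np.ndarray) -> np.ndarray:
    """Step 2: estimate the rigid transform between two lidars via point-to-point ICP."""
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(cloud_a))
    tgt = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(cloud_b))
    result = o3d.pipelines.registration.registration_icp(
        src, tgt,
        max_correspondence_distance=0.5,  # assumed tolerance, in meters
        init=np.eye(4),
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation          # 4x4: lidar_a frame -> lidar_b frame

def relative_camera_pose(pts1: np.ndarray, pts2: np.ndarray, K: np.ndarray):
    """Step 3: recover camera-to-camera rotation and translation from matched
    pixel coordinates using the epipolar constraint."""
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R, t                           # t is recovered only up to scale
```

Note that the translation from the essential matrix is defined only up to scale; resolving that scale and reconciling these pairwise estimates with the lidar-camera result of step 1) is the role of the global optimization in step 4).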

Description

Technical Field

[0001] The invention relates to the field of multi-sensor calibration, and in particular to an automatic calibration method for the spatial positions of multiple laser radars and multiple camera sensors.

Background

[0002] Information fusion is a key technology for autonomous driving safety, and multi-modal perception plays a key role in achieving robust environmental perception for unmanned driving. Multi-modal fusion is a prerequisite for robust perception: a single sensor cannot cope with all-weather scenarios because of inherent limitations of its hardware. Multi-sensor data also suffer from heterogeneity, modal differences, and unbalanced sampling, so dedicated fusion models must be designed to exploit them. Meanwhile, the sensor configurations of current major automakers' autonomous vehicles almost always include lidars and cameras; multi-modal fusion is an important way to improve accuracy, and reliable...


Application Information

IPC(8): G01S 7/497; G06T 7/80
CPC: G01S 7/497; G06T 7/80
Inventors: 张新钰, 王继鹏, 鲍泽峰, 熊一瑾, 高涵文, 郭世纯
Owner: 苏州驾驶宝智能科技有限公司