
Fusion correction method of stereo vision and low-beam lidar in unmanned driving

A lidar and stereo vision technology, applied in radio wave measurement systems, electromagnetic wave re-radiation, image enhancement and other fields. It addresses the problems that stereo matching cannot guarantee accurate pixel correspondence under complex lighting changes, and that a sparse lidar point cloud cannot be used directly for perception of the surrounding environment. The effects are reduced cost and improved binocular vision accuracy.

Active Publication Date: 2020-07-14
武汉环宇智行科技有限公司

AI Technical Summary

Problems solved by technology

[0002] Low-beam lidar is an important sensor for environment perception in unmanned driving and is suited to perceiving complex traffic environments. A lidar works by emitting laser beams toward a target; after a beam strikes the target it is reflected back and received by the system. Multi-beam lidar is relatively expensive, whereas low-beam lidar is relatively cheap, and the 3D point cloud it obtains has high accuracy. Its disadvantage is that the 3D point cloud is too sparse to be used directly for perception of the surrounding environment.
As an important branch of computer vision, binocular stereo vision is widely used in technologies such as unmanned automobiles, autonomous navigation of drones, and autonomous lunar rovers. It obtains three-dimensional information by computing the position deviation between corresponding points in the left and right images. However, environmental factors such as lighting strongly affect the matching, so the accuracy of the resulting 3D model is low, while more accurate sensing equipment is too expensive to be widely adopted.
According to the triangulation principle adopted by binocular stereo vision, Z = f·B / d, where Z is the distance between the camera and the object in front, B is the baseline, f is the focal length, and d is the disparity. The precise values of the baseline B and the focal length f can be determined by calibration, so the main error in Z comes from the disparity d. The disparity d is obtained mainly by the pixel-matching algorithm applied to the left and right stereo images; however, owing to complex changes in the lighting conditions of the real environment and other factors, it cannot be guaranteed that every pixel is matched accurately. Moreover, when the measured object is far away, i.e. when the true Z is large, a small deviation in the disparity d produces a large error in the measured value of Z.
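The quadratic growth of the depth error described above follows from differentiating Z = f·B/d: dZ ≈ -(Z²/(f·B))·dd. A minimal sketch, with illustrative focal length and baseline values that are not taken from the patent:

```python
# Stereo triangulation: Z = f * B / d, so a disparity error dd propagates as
# dZ ~= (Z**2 / (f * B)) * dd -- the depth error grows quadratically with Z.
# f and B below are illustrative values, not from the patent.
f = 1000.0   # focal length in pixels (assumed)
B = 0.5      # baseline in metres (assumed)

def depth_from_disparity(d):
    """Depth Z (metres) from disparity d (pixels)."""
    return f * B / d

def depth_error(Z, dd):
    """First-order depth error caused by a disparity error dd at true depth Z."""
    return (Z ** 2) / (f * B) * dd

# A fixed 0.5-pixel disparity error is small up close but large far away:
for Z in (5.0, 20.0, 50.0):
    print(f"Z = {Z:5.1f} m -> depth error ~ {depth_error(Z, 0.5):6.2f} m")
```

At Z = 5 m the 0.5 px error costs only a few centimetres, while at Z = 50 m it costs metres, which is exactly why far-field stereo depth needs correction.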




Embodiment Construction

[0016] The invention will be further described below with reference to the accompanying drawings and in combination with specific embodiments, so that those skilled in the art can implement it by referring to the description; the protection scope of the present invention is not limited to the specific embodiments.

[0017] The invention relates to a method for fusion correction of stereo vision and low-beam lidar in unmanned driving. The method includes the following steps:

[0018] (1) Register the binocular camera and the lidar spatially and temporally. The binocular camera is aimed at the target to collect images, and the lidar emits a line beam toward the target and collects data. Spatial registration means that the position of the binocular camera is matched with the position of the lidar; temporal registration means that image acquisition by the binocular camera and beam emission by the lidar occur simultaneously;...
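Once the two sensors are registered, lidar points can be projected into the left camera image and their depths converted to disparities via d = f·B/Z, yielding the sparse "true" disparity samples the later correction steps rely on. A minimal sketch; the intrinsics K and the lidar-to-camera extrinsics R, t below are hypothetical calibration values, not from the patent:

```python
import numpy as np

# Hypothetical calibration (a real system obtains these by calibration):
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])   # camera intrinsics (assumed)
R = np.eye(3)                             # lidar -> camera rotation (assumed)
t = np.array([0.1, 0.0, 0.0])             # lidar -> camera translation, m (assumed)
B = 0.5                                   # stereo baseline, metres (assumed)

def lidar_to_sparse_disparity(points, shape=(720, 1280)):
    """Project Nx3 lidar points into the image and return a sparse disparity map."""
    disp = np.zeros(shape, dtype=np.float32)
    cam = points @ R.T + t                # transform into the camera frame
    cam = cam[cam[:, 2] > 0.1]            # keep points in front of the camera
    uvw = cam @ K.T                       # pinhole projection
    u = (uvw[:, 0] / uvw[:, 2]).astype(int)
    v = (uvw[:, 1] / uvw[:, 2]).astype(int)
    valid = (u >= 0) & (u < shape[1]) & (v >= 0) & (v < shape[0])
    disp[v[valid], u[valid]] = K[0, 0] * B / cam[valid, 2]   # d = f*B/Z
    return disp
```

Because a low-beam lidar has few scan lines, the resulting map is sparse: most pixels stay zero and only the projected hits carry a disparity value.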



Abstract

The invention relates to a fusion correction method for stereo vision and low-beam lidar in unmanned driving. The binocular camera and the lidar are registered spatially and temporally; the binocular camera is aimed at the target to collect images while the lidar emits a line beam toward the target and collects data. The lidar data are converted into a disparity map, which is used to correct the disparity error of binocular vision: an error-compensation function is obtained from the distribution of the disparity error, after which the disparity map collected by binocular vision need only be substituted into the compensation function to obtain a corrected disparity map with minimal disparity error. Correcting the disparity of binocular vision improves visual accuracy. The semantically segmented disparity map and the compensated full-pixel disparity map are then fed as input to a deep learning network, and the disparity map trained by the network is finally obtained, further improving binocular vision accuracy in unmanned driving.
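The compensation-function step can be sketched as a regression of the stereo disparity error against the measured disparity at the pixels where lidar provides ground truth. The quadratic fit form and the synthetic data below are our assumptions for illustration; the patent does not specify the functional form:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stereo disparities at the pixels where lidar points project (synthetic):
d_stereo = rng.uniform(5.0, 80.0, 200)
# Synthetic ground-truth disparity error (illustrative linear trend):
true_error = 0.02 * d_stereo - 0.5
# Lidar-derived "true" disparities at those same pixels:
d_lidar = d_stereo - true_error

# Fit the compensation function: error(d) ~= c2*d^2 + c1*d + c0
coeffs = np.polyfit(d_stereo, d_stereo - d_lidar, 2)

def compensate(d_map):
    """Subtract the fitted disparity error from a stereo disparity map."""
    return d_map - np.polyval(coeffs, d_map)
```

After fitting, every pixel of the full stereo disparity map can be passed through `compensate`, not just the sparse pixels the lidar actually hit, which is the point of learning a function of the error distribution rather than correcting individual pixels.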

Description

Technical field

[0001] The invention relates to the technical field of unmanned driving, and in particular to a fusion correction method for stereo vision and low-beam lidar in unmanned driving.

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T5/00; G06T5/50; G01S17/86; G01S17/88
CPC: G06T5/50; G01S17/88; G06T2207/10044; G06T2207/20228; G06T2207/30252; G06T2207/20081; G01S17/86; G06T5/80
Inventor: 李明, 于欢, 肖衍佳
Owner: 武汉环宇智行科技有限公司