
A joint calibration method of a 3D lidar and a monocular camera

A joint calibration technology for a monocular camera and a three-dimensional lidar, applied in the field of information fusion, which addresses problems such as large errors, difficulty in accurately extracting spatially matched feature points, and overly complex schemes

Inactive Publication Date: 2016-12-14
BEIJING INSTITUTE OF TECHNOLOGY
Cites: 2 · Cited by: 58

AI Technical Summary

Problems solved by technology

The existing joint calibration schemes for a 3D lidar and a monocular camera are relatively complicated, and in some schemes it is difficult to accurately extract spatially matched feature points, resulting in large errors.

Method used


Image

  • Patent drawings (3): A joint calibration method of a 3D lidar and a monocular camera
Examples


Embodiment Construction

[0050] The present invention will be described in detail below in conjunction with the accompanying drawings and embodiments.

[0051] The main steps of a joint calibration method of a three-dimensional laser radar and a monocular camera are as follows:

[0052] Step 1: Establish the mathematical model of the monocular camera coordinate system and calibrate the intrinsic and extrinsic parameters of the camera;

[0053] The monocular camera coordinate system model adopted in this method is a pinhole approximation model, as shown in Figure 1. First, Zhang Zhengyou's monocular camera calibration method is used to calibrate the intrinsic parameters of the camera, yielding the intrinsic parameter matrix M, which contains the camera's effective focal length f, the image principal point coordinates (u0, v0), and the scale factors fx and fy, as well as the extrinsic parameters at each position, namely the orthogonal rotation matrix R and the translation vector T. Let the camera coordin...
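The pinhole approximation model of Step 1 can be sketched in a few lines of NumPy. This is an illustrative sketch only: the intrinsic values fx, fy, u0, v0 below are assumed placeholders, not values from the patent, and the function name is hypothetical.

```python
import numpy as np

# Pinhole camera model: a 3D point P_c = (X, Y, Z) in the camera frame
# projects to pixel coordinates (u, v) through the intrinsic matrix M
# built from the scale factors fx, fy and the principal point (u0, v0).
fx, fy = 800.0, 800.0   # scale factors in pixels (assumed values)
u0, v0 = 320.0, 240.0   # principal point (assumed values)

M = np.array([[fx, 0.0, u0],
              [0.0, fy, v0],
              [0.0, 0.0, 1.0]])

def project(point_cam):
    """Project a 3D camera-frame point to pixel coordinates."""
    uvw = M @ point_cam          # homogeneous image coordinates
    return uvw[:2] / uvw[2]      # perspective division by depth Z

# A point 2 m in front of the camera, offset 0.1 m to the right:
u, v = project(np.array([0.1, 0.0, 2.0]))
# u = 320 + 800 * 0.1 / 2 = 360.0, v = 240.0
```

In practice these intrinsics would come from Zhang's method (e.g. a chessboard calibration routine), which this sketch takes as given.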



Abstract

The invention relates to a joint calibration method for a 3D lidar and a monocular camera, and belongs to the field of information fusion. The invention aims to improve calibration efficiency while ensuring the accuracy of joint calibration for multi-sensor information fusion. First, the monocular camera is calibrated. Then, at each position, the following quantities are solved: the normal vector of the target calibration plate in the monocular camera coordinate system, the distance from the origin of the monocular camera coordinate system to the target calibration plate, the normal vector of the fitted plane in the 3D lidar coordinate system, and the distance from the origin of the 3D lidar coordinate system to the fitted plane. Next, an initial joint calibration result is solved using the distance correspondence principle. Finally, this result is used as the initial value of an optimization objective function, which is minimized to obtain the optimal joint calibration result. The method greatly reduces human participation, reduces the random errors caused by complicated manual operation, simplifies the process of multi-sensor joint calibration, and improves calibration precision.

Description

Technical Field

[0001] The invention relates to a joint calibration method for a three-dimensional lidar and a monocular camera, in particular to a multi-sensor data fusion technology for environmental perception on unmanned motion platforms, and belongs to the field of information fusion.

Background Technique

[0002] For the perception system of an unmanned motion platform, multi-sensor data fusion can reduce the impact of sudden environmental changes on system performance and improve the robustness of the system. Multi-sensor information fusion can obtain more information and higher resolution than a single sensor, and can improve system reliability and fault tolerance. The premise of information fusion between multiple sensors is joint calibration. Although the visual sensor obtains rich information, it is particularly susceptible to external factors such as weather and light, and lacks the three...

Claims


Application Information

IPC IPC(8): G06T7/00
Inventor 李静于刘志王军政汪首坤赵江波沈伟马立玲
Owner BEIJING INSTITUTE OF TECHNOLOGY