
A Joint Calibration Method of Multi-line LiDAR and Camera Based on Refined Radar Scanning Edge Points

A LiDAR and camera joint calibration technology, applicable to radio wave measurement systems, image analysis, image enhancement, etc. It solves the problem of inaccurate edge calibration points, avoiding insufficient precision and achieving accurate calibration results.

Active Publication Date: 2021-08-03
ZHEJIANG UNIV OF TECH


Problems solved by technology

[0005] In order to solve the problem that existing LiDARs acquire edge calibration points inaccurately, the present invention provides a joint calibration method for LiDAR and cameras based on refined radar scanning edge points, which improves the accuracy of edge calibration points.




Embodiment Construction

[0043] The present invention will be further described below in conjunction with the accompanying drawings.

[0044] Referring to Figures 1 to 4, a multi-line LiDAR and camera joint calibration method based on refined radar scanning edge points includes the following steps:

[0045] 1) Perform intrinsic calibration on the camera to obtain the camera's intrinsic matrix K:

[0046]
    K = [ f   0   O_x
          0   f   O_y
          0   0   1   ]

[0047] where f is the focal length of the camera and [O_x, O_y] is the principal point. As shown in Figure 1, design a calibration object with spatial geometric features: a board bearing four hollow circles of the same radius r, whose centers are a common distance l apart. Place it where the camera and the LiDAR can observe it simultaneously. As shown in Figure 2, the positions of the LiDAR and the camera are relatively fixed, the distance from the calibration board to the LiDAR is L, and the distance between the background wall...
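The projection implied by step 1 can be sketched as follows. This is a minimal illustration of the standard pinhole model with the intrinsic matrix K described above; the numeric values of f, O_x, and O_y are hypothetical placeholders, not values from the patent.

```python
import numpy as np

# Hypothetical intrinsics for illustration only; real values come
# from the camera's intrinsic calibration (step 1 of the method).
f = 800.0                # focal length in pixels
Ox, Oy = 320.0, 240.0    # principal point

# Intrinsic matrix K as defined in the text.
K = np.array([[f,   0.0, Ox],
              [0.0, f,   Oy],
              [0.0, 0.0, 1.0]])

def project(point_cam):
    """Project a 3D point (camera frame, metres) to pixel coordinates."""
    p = K @ point_cam
    return p[:2] / p[2]   # perspective division by depth

# A point directly on the optical axis projects to the principal point.
u, v = project(np.array([0.0, 0.0, 2.0]))
print(u, v)  # 320.0 240.0
```

Under this model, the detected circle centers on the board and the refined LiDAR edge points are related through K and the unknown extrinsics that the method solves for.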



Abstract

A multi-line LiDAR and camera joint calibration method based on refined radar scanning edge points, mainly involving technical fields such as robot vision and multi-sensor fusion. Because of the limited resolution of the LiDAR, extracted scanning edge points are often not accurate enough, resulting in inaccurate calibration results. Exploiting the sudden change in range that LiDAR points exhibit at an edge, the present invention searches and compares repeatedly and takes the point closest to the edge as a calibration point. The translation between the camera and the LiDAR is calculated according to the pinhole camera model by detecting circles in the camera image and matching them to radar edge points. The calibration parameter C is then searched in the neighborhood of the resulting translation vector to find the calibration result that minimizes the projection error. The invention extracts the points the LiDAR scans on the edge of an object with high precision, avoids the insufficient precision caused by the low resolution of the LiDAR, and thus improves the calibration accuracy.

Description

Technical Field

[0001] The invention relates to the technical fields of robot vision and multi-sensor fusion, and in particular to a multi-line LiDAR and camera joint calibration method oriented to point cloud edge refinement.

Background Technique

[0002] The fusion of LiDAR and camera is widely used in 3D reconstruction for robot vision, autonomous navigation and positioning, and drones. A single sensor has limitations: cameras are susceptible to lighting conditions and blur in the external environment, while LiDAR data points are sparse; fusing the two makes up for their respective shortcomings.

[0003] In order to fuse the information acquired by LiDAR and camera, joint calibration between the two is essential. Calibration determines the conversion relationship between the two sensors' spatial coordinate systems, so that the information obtained by the different sensors can be fused into a unified coordinate system. At present, most of the joint calib...
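The coordinate conversion described in [0003] is a rigid transform: a rotation R and translation t map a LiDAR-frame point into the camera frame. A minimal sketch, with hypothetical extrinsics standing in for the quantities joint calibration actually solves for:

```python
import numpy as np

# Hypothetical extrinsics for illustration: axes assumed aligned,
# camera offset 10 cm along x from the LiDAR.
R = np.eye(3)
t = np.array([0.1, 0.0, 0.0])

p_lidar = np.array([0.0, 0.0, 3.0])   # a LiDAR return 3 m ahead
p_cam = R @ p_lidar + t               # same point in the camera frame
print(p_cam)  # [0.1 0.  3. ]
```

Once R and t are calibrated, every LiDAR point can be expressed in the camera frame and projected into the image, which is what makes the fused representation possible.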

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/80; G06T7/136; G06T7/194; G06T7/13; G06T7/66; G06T7/60; G01S7/497
CPC: G01S7/497; G06T7/60; G06T2207/10028; G06T2207/10044; G06T2207/20061; G06T2207/30208; G06T7/13; G06T7/136; G06T7/194; G06T7/66; G06T7/80
Inventors: 张剑华, 冯宇婷, 王曾媛, 吴佳鑫, 林瑞豪, 陈胜勇
Owner ZHEJIANG UNIV OF TECH