
Map construction method, device and system based on laser radar assisted vision

A technology combining laser radar and map construction, applied in the field of computer technology, which achieves the effects of improving positioning accuracy and improving the mapping result.

  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] The technical problem to be solved by the present invention is to provide, in view of the above-mentioned deficiencies in the prior art, a map construction method based on laser radar-assisted vision. The method integrates laser radar positioning and machine vision positioning so that each compensates for the defects of the other single positioning technology, improving positioning accuracy and the mapping effect.



Examples


Embodiment 1

[0030] Embodiment 1: As shown in Figure 1, a map construction method based on lidar-assisted vision includes the following steps:

[0031] S1: Calibrate the lidar and visual camera that observe the same observation object, and obtain the calibration conversion parameters between the radar coordinate system and the camera coordinate system;
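As an illustration of how such calibration conversion parameters are typically used (the patent gives no code), below is a minimal sketch that maps radar-frame points into the camera frame, assuming the parameters take the usual form of a rotation matrix R and translation vector t; the function and variable names are illustrative, not from the patent.

```python
import numpy as np

def lidar_to_camera(points_radar, R, t):
    """Map N x 3 radar-frame points into the camera frame using the
    calibration conversion parameters (rotation R, translation t).

    points_radar : (N, 3) lidar points in the radar coordinate system
    R            : (3, 3) rotation matrix, radar frame -> camera frame
    t            : (3,)   translation vector, radar frame -> camera frame
    """
    return points_radar @ R.T + t
```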

[0032] S2: Obtain the radar point cloud data sequence collected by the lidar, and use the ICP point cloud matching method to match and solve the radar point cloud data sequence according to the calibration conversion parameters, so as to obtain the intermediate conversion parameter sequence corresponding to the radar point cloud data sequence;
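The patent only names the ICP point cloud matching method. For context, here is a minimal point-to-point ICP sketch showing how one intermediate conversion parameter (a rotation and translation between two scans) could be solved with NumPy/SciPy; the nearest-neighbour search and the closed-form SVD alignment step are standard choices assumed here, not details taken from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, init_R=None, init_t=None, iters=30):
    """Minimal point-to-point ICP: estimate R, t aligning `source` to `target`.

    source, target : (N, 3) / (M, 3) point clouds from consecutive lidar scans
    Returns R, t, i.e. one 'intermediate conversion parameter' relating the scans.
    """
    R = np.eye(3) if init_R is None else init_R
    t = np.zeros(3) if init_t is None else init_t
    tree = cKDTree(target)
    for _ in range(iters):
        moved = source @ R.T + t                 # apply current estimate
        _, idx = tree.query(moved)               # nearest neighbours in target
        matched = target[idx]
        # Closed-form rigid alignment (Kabsch/SVD) between matched pairs.
        mu_s, mu_t = moved.mean(axis=0), matched.mean(axis=0)
        H = (moved - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        dR = Vt.T @ D @ U.T
        dt = mu_t - dR @ mu_s
        R, t = dR @ R, dR @ t + dt               # compose the incremental update
    return R, t
```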

[0033] S3: Obtain the camera image data sequence collected by the visual camera, perform triangulation processing on the camera image data sequence according to the intermediate conversion parameter sequence, and obtain an optimized image feature sequence;
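Triangulation of a matched image feature from two camera poses is commonly done with a linear (DLT) solve; the sketch below assumes the two projection matrices are built from the intermediate conversion parameters expressed in the camera frame via the calibration, and is offered as general context rather than the patent's specific procedure.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one feature seen in two frames.

    P1, P2 : (3, 4) projection matrices K @ [R | t] for the two camera poses
    x1, x2 : (2,) pixel coordinates of the matched feature in each image
    Returns the 3-D point (part of an 'optimized image feature sequence').
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # dehomogenise
```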

[0034] S4: Optimize the intermediate conversion parameter sequence according to the radar point cloud data sequence and the optimized image feature sequence using the least squares method, to obtain a target conversion parameter sequence;

S5: Perform positioning and mapping according to the target conversion parameter sequence and the camera image data sequence to complete the map construction.
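The least-squares optimization in S4 is not spelled out on this page. As one plausible reading, the sketch below refines a single intermediate transform by minimizing the reprojection error of the optimized image features with scipy.optimize.least_squares; the cost function, parameterization, and omission of the point-cloud residual terms are assumptions, not the patent's formulation.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def refine_pose(rvec_tvec0, K, points_3d, pixels):
    """Refine one intermediate transform by least squares on reprojection error.

    rvec_tvec0 : initial 6-vector (axis-angle rotation, translation), e.g. from ICP
    K          : (3, 3) camera intrinsics
    points_3d  : (N, 3) triangulated features (optimized image feature sequence)
    pixels     : (N, 2) observed pixel positions of those features
    """
    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        cam = points_3d @ R.T + x[3:]          # world -> camera
        proj = cam @ K.T
        proj = proj[:, :2] / proj[:, 2:3]      # perspective divide
        return (proj - pixels).ravel()

    result = least_squares(residuals, rvec_tvec0)
    return result.x                            # one refined 'target conversion parameter'
```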

Embodiment 2

[0087] Embodiment 2: As shown in Figure 2, a map construction device based on lidar-assisted vision, which adopts the map construction method of Embodiment 1, includes a calibration module, an acquisition module, a matching module, a processing module, an optimization module and a mapping module;

[0088] The calibration module is used to calibrate the laser radar and the visual camera for observing the same observation object, so as to obtain the calibration conversion parameters between the radar coordinate system and the camera coordinate system;

[0089] The acquisition module is used to acquire the radar point cloud data sequence collected by the lidar and the camera image data sequence collected by the visual camera;

[0090] The matching module is used to match and solve the radar point cloud data sequence with the ICP point cloud matching method according to the calibration conversion parameters, so as to obtain the intermediate conversion parameter sequence corresponding to the radar point cloud data sequence;
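A minimal sketch of how the modules of Embodiment 2 could be wired together in code; the class, method names, and call order are illustrative assumptions based on steps S1 to S5 of Embodiment 1, not an implementation disclosed by the patent.

```python
class MapConstructionDevice:
    """Illustrative wiring of the modules named in Embodiment 2.
    All interfaces below are hypothetical placeholders."""

    def __init__(self, calibration, acquisition, matching, processing,
                 optimization, mapping):
        self.calibration = calibration    # radar <-> camera calibration (S1)
        self.acquisition = acquisition    # buffers point-cloud / image sequences
        self.matching = matching          # ICP over the point-cloud sequence (S2)
        self.processing = processing      # triangulation of image features (S3)
        self.optimization = optimization  # least-squares refinement (S4)
        self.mapping = mapping            # positioning and map building (S5)

    def build_map(self):
        calib = self.calibration.calibrate()
        clouds, images = self.acquisition.read_sequences()
        intermediate = self.matching.icp_solve(clouds, calib)
        features = self.processing.triangulate(images, intermediate)
        target = self.optimization.refine(intermediate, clouds, features)
        return self.mapping.build(target, images)
```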

Embodiment 3

[0134] Embodiment 3: As shown in Figure 3, a map construction system based on laser radar-assisted vision includes a laser radar, a visual camera, and the map construction device of Embodiment 2. The laser radar and the visual camera are fixed together and are used to observe the same observation object; both the lidar and the vision camera are communicatively connected with the map construction device;

[0135] The lidar is used to collect the radar point cloud data sequence of the observed object and output it to the map construction device;

[0136] The visual camera is used to collect the camera image data sequence of the observed object and output it to the map construction device.
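For illustration only, a toy data flow for this system in which both sensors stream their sequences to the map construction device; the sensor interfaces (next_scan, next_frame) and the acquisition-module load call are hypothetical, not part of the patent.

```python
def run_system(lidar, camera, device, n_frames=100):
    """Toy data flow for Embodiment 3: the lidar and the visual camera observe
    the same object and hand their sequences to the map construction device."""
    clouds = [lidar.next_scan() for _ in range(n_frames)]    # radar point cloud data sequence
    images = [camera.next_frame() for _ in range(n_frames)]  # camera image data sequence
    device.acquisition.load(clouds, images)                  # acquisition module buffers both
    return device.build_map()
```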

[0137] The map construction system based on laser radar-assisted vision in this embodiment realizes high-precision, real-time positioning and mapping. It integrates laser radar positioning and machine vision positioning, reduces the impact of the environment on visual positioning, and makes up for the defects of a single positioning technology, improving positioning accuracy and the mapping effect.



Abstract

The invention relates to a map construction method, device and system based on laser radar-assisted vision. The method comprises the steps of: calibrating a laser radar and a vision camera, and obtaining calibration conversion parameters between a radar coordinate system and a camera coordinate system; acquiring a radar point cloud data sequence, and performing matching solution on the radar point cloud data sequence with an ICP point cloud matching method to obtain an intermediate conversion parameter sequence; obtaining a camera image data sequence, and performing triangulation processing on the camera image data sequence according to the intermediate conversion parameter sequence to obtain an optimized image feature sequence; optimizing the intermediate conversion parameter sequence according to the radar point cloud data sequence and the optimized image feature sequence by the least squares method to obtain a target conversion parameter sequence; and performing positioning and mapping according to the target conversion parameter sequence and the camera image data sequence to complete the map construction. The method overcomes the defects of using visual positioning alone and effectively improves positioning accuracy and the mapping effect.

Description

Technical Field

[0001] The present invention relates to the field of computer technology, and in particular to a map construction method, device and system based on laser radar-assisted vision.

Background Technique

[0002] In the operation of modern unmanned vehicles and robots, simultaneous positioning and map construction is an indispensable key part: to realize automatic driving or automatic navigation, an unmanned vehicle or robot must first obtain a precise position in order to navigate accurately.

[0003] There are many existing positioning technologies, such as single-point GPS positioning, differential GPS positioning, lidar positioning, and machine vision-based positioning. However, each positioning technology has its own disadvantages. For example, single-point GPS positioning needs to rely on the number of satellites to ensure positioning quality, has strong limitations, and has high cost; differential GPS positioning te...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/05; G06T17/20
CPC: G06T17/05; G06T17/20
Inventors: 张扬吉, 陈志佳