Mobile robot localization method and system based on 3D point cloud and vision fusion

A mobile robot and three-dimensional point cloud technology, applied in the fields of radio wave measurement systems, instruments, and electromagnetic wave re-radiation, which solves problems such as difficult positioning and achieves the effect of avoiding the positioning difficulties of feature-sparse outdoor scenes

Active Publication Date: 2022-07-15
SHANGHAI JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

However, existing point cloud map solutions need to store a large amount of point cloud data and match the real-time point cloud against the entire map.
Maps built with visual SLAM, in turn, make accurate positioning difficult in outdoor scenes with sparse features.




Embodiment Construction

[0055] The present invention will be described in detail below with reference to specific embodiments. The following examples will help those skilled in the art to further understand the present invention, but do not limit it in any form. It should be noted that those skilled in the art can make several changes and improvements without departing from the inventive concept, and these all fall within the protection scope of the present invention.

[0056] In view of the defects in the prior art, the present invention provides a method that combines laser point cloud features and image features into joint features and matches them against a feature grid map for positioning. The method involves establishing an environment map, fusing point cloud and visual features, and matching against the map. The environment map is a feature grid map of the environment, in which each grid cell stores a point set composed of feature points extr...
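
The paragraph above describes a feature grid map whose cells accumulate feature points and summarize them by height, intensity, and normal-vector projection values. Below is a minimal Python sketch of such a map; the grid resolution, the per-cell statistics (simple means), and all names are illustrative assumptions, not taken from the patent text.

import numpy as np

GRID_RESOLUTION = 0.5  # metres per grid cell (assumed value)

def build_feature_grid_map(joint_feature_points):
    """joint_feature_points: (N, 5) array of [x, y, z, intensity, normal_z],
    i.e. joint point-cloud / visual feature points already in the map frame."""
    cells = {}
    for x, y, z, intensity, normal_z in joint_feature_points:
        # Assign each feature point to a 2D grid cell by its x, y position.
        key = (int(np.floor(x / GRID_RESOLUTION)),
               int(np.floor(y / GRID_RESOLUTION)))
        cells.setdefault(key, []).append((z, intensity, normal_z))

    # Reduce each cell's point set to a compact feature vector:
    # mean height, mean intensity, and mean normal-vector projection.
    feature_map = {}
    for key, points in cells.items():
        feature_map[key] = np.asarray(points).mean(axis=0)
    return feature_map

# Example: three joint feature points falling into two grid cells.
pts = np.array([
    [0.2, 0.1, 1.5, 40.0, 0.9],
    [0.3, 0.2, 1.7, 38.0, 0.8],
    [2.1, 0.1, 0.4, 12.0, 0.2],
])
grid_map = build_feature_grid_map(pts)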



Abstract

The present invention provides a mobile robot positioning method and system based on three-dimensional point cloud and vision fusion. The environment map is a feature grid map of the environment: each grid cell stores a point set composed of feature points extracted from the point cloud together with visual image feature points, from which height values, intensity values, and normal vector projection values are extracted. The fusion of point cloud and visual features projects the feature points extracted from the image into the point cloud space, forming joint feature points with the point cloud features. Map matching and positioning projects the joint feature points onto a two-dimensional grid, extracts the feature vectors, matches the feature grid against the map, uses a histogram filter to determine the posterior probability of each candidate pose, and determines the position of the robot in the map from these posterior probabilities.
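
The matching step in the abstract (comparing the online feature grid with the map and using a histogram filter over candidate poses) could look roughly like the following sketch. The Gaussian scoring function, the project_fn interface, and the noise parameter are assumptions for illustration; the patent summary does not specify them.

import numpy as np

def match_score(online_feat, map_feat, sigma=1.0):
    """Gaussian similarity between the feature vectors of two grid cells (assumed scoring)."""
    d = np.linalg.norm(np.asarray(online_feat) - np.asarray(map_feat))
    return np.exp(-0.5 * (d / sigma) ** 2)

def histogram_filter_update(prior, candidate_poses, online_grid, map_grid, project_fn):
    """prior: dict mapping each candidate pose to its prior probability.
    project_fn(pose, cell_key): maps an online-grid cell index into map-grid
    coordinates under the candidate pose (hypothetical interface)."""
    posterior = {}
    for pose in candidate_poses:
        likelihood = 1.0
        for cell_key, online_feat in online_grid.items():
            map_feat = map_grid.get(project_fn(pose, cell_key))
            if map_feat is not None:
                likelihood *= match_score(online_feat, map_feat)
        posterior[pose] = prior.get(pose, 1e-9) * likelihood

    total = sum(posterior.values()) or 1.0          # normalise the histogram
    posterior = {p: v / total for p, v in posterior.items()}
    best_pose = max(posterior, key=posterior.get)   # robot position estimate
    return posterior, best_pose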

Description

Technical field

[0001] The invention relates to the technical field of mobile robot positioning and navigation, in particular to a mobile robot positioning method and system based on three-dimensional point cloud and vision fusion, and more specifically to a feature grid map matching and positioning method for mobile robots based on multi-sensor fusion.

Background technique

[0002] Mobile robots usually have the function of autonomous positioning and navigation: they need to build environmental maps and achieve high-precision positioning based on the constructed maps. The positioning problem is a key problem in the field of robotics.

[0003] Positioning systems play a pivotal role in autonomous vehicles. Other modules, such as perception and path planning, rely to varying degrees on the positioning results generated by the positioning system. The accuracy of positioning is one of the keys that directly affects t...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T17/05; G06T19/00; G06T7/73; G06T3/00; G01S17/89
CPC: G06T17/05; G06T3/005; G06T7/73; G01S17/89; G06T19/006; G06T2207/10028
Inventors: 王贺升, 赵小文
Owner: SHANGHAI JIAOTONG UNIV