
SLAM mapping method and system based on multi-sensor fusion

A multi-sensor fusion technology, applied to radio wave measurement systems, satellite radio beacon positioning systems, instruments, etc.; it addresses the problems of large errors and low precision and achieves the effect of alleviating low precision.

Active Publication Date: 2022-04-15
BEIJING GREEN VALLEY TECH CO LTD

AI Technical Summary

Problems solved by technology

[0005] In view of this, the object of the present invention is to provide a SLAM mapping method and system based on multi-sensor fusion, so as to alleviate the technical problems in the prior art of low precision and large errors caused by being easily restricted by the surrounding environment.



Examples


Embodiment 1

[0027] Figure 1 is a flow chart of a SLAM mapping method based on multi-sensor fusion provided according to an embodiment of the present invention; the method is applied to a server. As shown in Figure 1, the method specifically includes the following steps:

[0028] Step S102, acquiring multiple sensor data about the surrounding environment of the mobile platform; the multiple sensor data include: point cloud data, image data, IMU data and GNSS data.

[0029] Specifically, the point cloud information of the surrounding environment is collected by the laser scanner to obtain the point cloud data; the image information is collected by the camera to obtain the image data; the angular velocity and acceleration of the mobile platform are measured by the IMU to obtain the IMU data; and the absolute latitude and longitude coordinates at each moment are obtained through GNSS to obtain the GNSS data.
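As a loose illustration only (this excerpt does not specify any data formats), the following Python sketch defines container types, with assumed names and fields, for the four sensor streams named in step S102:

```python
# Illustrative sketch only: the patent excerpt does not specify data formats,
# so these container types (names and fields are assumptions) merely mirror
# the four sensor streams named in step S102.
from dataclasses import dataclass
import numpy as np

@dataclass
class PointCloudFrame:               # from the laser scanner
    timestamp: float
    points: np.ndarray               # (N, 3) array of x, y, z returns

@dataclass
class ImageFrame:                    # from the camera
    timestamp: float
    pixels: np.ndarray               # (H, W, 3) RGB image

@dataclass
class ImuSample:                     # from the IMU
    timestamp: float
    angular_velocity: np.ndarray     # (3,) rad/s
    linear_acceleration: np.ndarray  # (3,) m/s^2

@dataclass
class GnssFix:                       # from the GNSS receiver
    timestamp: float
    latitude: float                  # degrees
    longitude: float                 # degrees
    altitude: float                  # metres
```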

[0030] Step S104, performing hierarchical processing on the multiple sensor data to generate multiple positioning information; wherein one sensor data corresponds to one positioning information.
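The following is a minimal Python sketch of the "one sensor stream yields one positioning information" idea in step S104. The processor functions are hypothetical placeholders; the excerpt does not disclose how each stream is actually processed.

```python
# Minimal sketch of step S104: each sensor stream is processed separately and
# yields one positioning estimate. The per-sensor processors are placeholders
# (assumed names); their internals are not disclosed in this excerpt.
from typing import Any, Callable, Dict
import numpy as np

def process_lidar(frame: Any) -> np.ndarray:
    """Placeholder: would run scan matching and return a 6-DoF pose estimate."""
    return np.zeros(6)

def process_camera(frame: Any) -> np.ndarray:
    """Placeholder: would run visual odometry and return a 6-DoF pose estimate."""
    return np.zeros(6)

def process_imu(sample: Any) -> np.ndarray:
    """Placeholder: would integrate angular velocity/acceleration into a pose."""
    return np.zeros(6)

def process_gnss(fix: Any) -> np.ndarray:
    """Placeholder: would convert latitude/longitude into a local-frame position."""
    return np.zeros(6)

PROCESSORS: Dict[str, Callable[[Any], np.ndarray]] = {
    "lidar": process_lidar,
    "camera": process_camera,
    "imu": process_imu,
    "gnss": process_gnss,
}

def hierarchical_processing(sensor_data: Dict[str, Any]) -> Dict[str, np.ndarray]:
    """Return one positioning estimate per sensor stream; a later fusion
    step combines them into the target positioning information."""
    return {name: PROCESSORS[name](data) for name, data in sensor_data.items()}
```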

Embodiment 2

[0095] Figure 6 is a schematic diagram of a SLAM mapping system based on multi-sensor fusion provided according to an embodiment of the present invention; the system is applied to a server. As shown in Figure 6, the system includes: an acquisition module 10, a hierarchical processing module 20, a positioning module 30, a first generation module 40 and a second generation module 50.

[0096] Specifically, the acquisition module 10 is configured to acquire multiple sensor data about the surrounding environment of the mobile platform; the multiple sensor data include: point cloud data, image data, IMU data and GNSS data.

[0097] The hierarchical processing module 20 is configured to perform hierarchical processing on multiple sensor data to generate multiple positioning information; wherein one sensor data corresponds to one positioning information.

[0098] The positioning module 30 is configured to obtain the target positioning information of the mobile platform based on the multiple positioning information.
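To make the module split in Embodiment 2 concrete, here is a rough Python sketch of how the five modules could be wired together. The class and method names are assumptions for illustration; the excerpt only names the modules and their responsibilities.

```python
# Rough sketch of the five modules named in Embodiment 2. All names and
# method signatures are assumptions; only the responsibilities come from
# the patent text shown here.

class AcquisitionModule:                 # acquisition module 10
    def acquire(self) -> dict:
        """Would return the point cloud, image, IMU and GNSS data."""
        return {}

class HierarchicalProcessingModule:      # hierarchical processing module 20
    def process(self, sensor_data: dict) -> dict:
        """Would return one positioning estimate per sensor stream."""
        return {}

class PositioningModule:                 # positioning module 30
    def fuse(self, estimates: dict):
        """Would fuse the per-sensor estimates into the target positioning."""
        return None

class LocalMapModule:                    # first generation module 40
    def build(self, target_pose):
        """Would build the local high-precision map around the pose."""
        return None

class GlobalMapModule:                   # second generation module 50
    def close_loops(self, local_map):
        """Would run closed-loop detection and return the global map."""
        return None

def run_once(acq, proc, pos, local_gen, global_gen):
    """One pass through the pipeline described in paragraphs [0096] onward."""
    data = acq.acquire()
    estimates = proc.process(data)
    target_pose = pos.fuse(estimates)
    local_map = local_gen.build(target_pose)
    return global_gen.close_loops(local_map)
```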


Abstract

The present invention provides a SLAM mapping method and system based on multi-sensor fusion, applied to a server, including: acquiring multiple sensor data about the surrounding environment of the mobile platform, the multiple sensor data including point cloud data, image data, IMU data and GNSS data; performing hierarchical processing on the multiple sensor data to generate multiple positioning information, wherein one sensor data corresponds to one positioning information; obtaining the target positioning information of the mobile platform based on the multiple positioning information; generating a local high-precision map based on the target positioning information; and performing a closed-loop detection operation on the local high-precision map to obtain a high-precision global map of the mobile platform. The invention alleviates the technical problem of low precision caused by the limitation of the surrounding environment existing in the prior art.
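The abstract states that the target positioning information is obtained from the multiple per-sensor positioning information, but this excerpt does not say how they are combined. As a stand-in only, the sketch below fuses per-sensor pose estimates with a simple inverse-covariance-weighted average; the actual patented fusion scheme may differ.

```python
# Stand-in fusion step: combine per-sensor 6-DoF pose estimates with an
# inverse-covariance-weighted average. This is an assumed illustration,
# not the fusion scheme claimed by the patent.
import numpy as np

def fuse_estimates(poses, covariances):
    """Fuse pose estimates (list of (6,) arrays) weighted by the inverse of
    their (6, 6) covariance matrices."""
    information = [np.linalg.inv(c) for c in covariances]   # weight = inverse covariance
    total_information = sum(information)
    weighted_sum = sum(w @ p for w, p in zip(information, poses))
    return np.linalg.solve(total_information, weighted_sum)

# Example: three roughly consistent estimates with different uncertainties.
poses = [np.array([1.0, 2.0, 0.0, 0.0, 0.0, 0.10]),
         np.array([1.1, 1.9, 0.0, 0.0, 0.0, 0.12]),
         np.array([0.9, 2.1, 0.1, 0.0, 0.0, 0.08])]
covs = [np.eye(6) * s for s in (0.5, 1.0, 2.0)]
print(fuse_estimates(poses, covs))
```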

Description

Technical field

[0001] The invention relates to the technical field of navigation multi-sensor fusion, and in particular to a SLAM mapping method and system based on multi-sensor fusion.

Background technique

[0002] SLAM (Simultaneous Localization and Mapping) refers to real-time positioning and map construction: by processing the surrounding environment data collected by sensors, the position of the current motion system in an unknown environment is fed back in real time while a map of the surrounding environment of the motion system is drawn at the same time; this map can be a 2D plane map or a 3D map of the surrounding environment. It has been widely used in robotics, autonomous driving, virtual reality, surveying and mapping, agriculture, forestry, electric power, construction and other industries. Currently commonly used sensor units include laser scanners, inertial measurement units (IMU), visual cameras, and global navigation satellite ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G01C21/00, G01C21/16, G01S19/45, G01S19/47
CPC: G01C21/005, G01C21/165, G01S19/45, G01S19/47, G01S5/16, G01S19/485, G01S19/49, G01C25/00, G01C21/1652, G01C21/1656, G01C21/3837, G01C21/3848, G06T7/579, G06T2207/30244, G06T2207/10016
Inventor: 刘继廷 (the other inventors have requested that their names not be disclosed)
Owner: BEIJING GREEN VALLEY TECH CO LTD