
Multi-sensor fusion SLAM algorithm and system thereof

A multi-sensor fusion SLAM technology, applicable to radio-wave measurement systems, satellite radio beacon positioning systems, and instrumentation, which achieves reliable positioning results and overcomes the limitations of any single sensor.

Status: Inactive; Publication Date: 2021-02-02
Applicant: 的卢技术有限公司

AI Technical Summary

Problems solved by technology

[0003] At present, positioning is an extremely important part of a vehicle ADAS system, and in the absence of maps, SLAM technology provides a good solution. The current mainstream approaches are vision-based and laser-based, but purely visual or purely laser solutions each have problems. For example, a pure vision solution requires rich texture in the scene, while a pure laser solution lacks a loop-closure detection method.



Examples


Embodiment 1

[0039] Referring to figure 1, this embodiment proposes a SLAM fusion algorithm based on multiple camera types, which resolves the bottleneck of a single camera type at low hardware cost. Only three groups of cameras are needed to cover multiple scenes, removing the application restrictions of a single camera type and meeting the requirements of a vehicle advanced driver assistance system. GPS and IMU are combined into a high-precision inertial navigation unit; when differential RTK corrections are available, positioning accuracy can reach the centimeter level. The binocular camera plus IMU provides positioning when GPS is weak and scene texture is rich. If the texture is not rich enough and vision cannot provide reliable information, IMU positioning can be relied on for a short period of time. Multiple scenarios are considered and multiple data sources are fused, so the positioning results are reliable and can overcome the limitations of a single sensor.
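The scene-dependent fallback strategy described in this embodiment (RTK-corrected GPS+IMU, then stereo visual-inertial positioning, then short-term IMU dead reckoning) can be sketched as a simple decision rule. This is an illustrative assumption, not the patent's implementation; the function name, fields, and thresholds are hypothetical.

```python
# Hypothetical sketch of the source-selection strategy in Embodiment 1.
# All names and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class SensorStatus:
    rtk_fixed: bool        # differential RTK fix available
    gps_quality: float     # 0.0 (no signal) .. 1.0 (strong)
    texture_score: float   # 0.0 (textureless) .. 1.0 (rich texture)

def select_positioning_source(s: SensorStatus) -> str:
    """Pick the positioning source for the current frame:
    - RTK-corrected GPS+IMU when available (centimeter-level accuracy)
    - binocular visual-inertial positioning when GPS is weak but texture is rich
    - short-term IMU dead reckoning as the fallback
    """
    if s.rtk_fixed or s.gps_quality > 0.8:
        return "gps_imu"
    if s.texture_score > 0.5:
        return "stereo_vio"
    return "imu_dead_reckoning"

# Example: weak GPS in a well-textured urban canyon
print(select_positioning_source(
    SensorStatus(rtk_fixed=False, gps_quality=0.2, texture_score=0.9)))
```

In practice such a decision unit would hysterese between sources rather than switch per frame, but the priority order matches the embodiment's description.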

Embodiment 2

[0114] Referring to Figures 7-9, it should be further explained that the hardware synchronization module 200 is implemented with a synchronous-exposure hardware circuit and a GPS time-correction module. The time-information receiving module of the microcontroller receives the time message and time pulse from the GPS time receiver; the RTC (real-time clock) chip is connected to the microcontroller's RTC module; the microcontroller's exposure-signal control module controls the exposure of the four camera channels, and the cameras' feedback signals are connected to the microcontroller's feedback-signal detection module. Note that in this embodiment the feedback signal comes from three of the camera channels.

[0115] The specific process is as follows: when the microcontroller receives an external exposure-request signal, the three output pins act immediately and apply a delay. Since the camera exposure action h...
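The time-correction scheme described above (GPS time message plus a 1 Hz time pulse) can be illustrated with a small sketch: the PPS edge marks the top of a GPS second, and each exposure trigger is stamped by its offset from the most recent PPS edge. This is a generic PPS-discipline pattern, not the patent's firmware; the class and method names are hypothetical.

```python
# Illustrative sketch (assumed, not from the patent) of stamping camera
# exposures with GPS time using the receiver's 1 Hz time pulse (PPS).

class ExposureTimestamper:
    def __init__(self):
        self.pps_local = None   # local clock reading at the last PPS edge (s)
        self.gps_second = None  # GPS time of that edge (s since some epoch)

    def on_pps(self, local_time: float, gps_time_of_pulse: float):
        """Called on each 1 Hz GPS time pulse, with the matching time message."""
        self.pps_local = local_time
        self.gps_second = gps_time_of_pulse

    def stamp_exposure(self, local_time: float) -> float:
        """Convert a local exposure-trigger time to GPS time."""
        if self.pps_local is None:
            raise RuntimeError("no PPS received yet")
        return self.gps_second + (local_time - self.pps_local)

ts = ExposureTimestamper()
ts.on_pps(local_time=1000.000, gps_time_of_pulse=1_600_000_000.0)
print(ts.stamp_exposure(1000.250))  # exposure 250 ms after the PPS edge
```

Stamping all camera channels against the same PPS reference is what lets the hardware synchronization module 200 put heterogeneous image streams on a common timestamp base.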

Embodiment 3

[0132] Referring to Figures 10-11, this embodiment proposes a SLAM fusion system based on multiple camera types, on which the method of the above embodiments can be implemented. Its system framework includes image processing (the monocular camera is mainly responsible for image rectification and ORB feature-point extraction; the binocular camera additionally computes the depth map; the depth camera additionally aligns the color map with the depth map), a decision-making unit that selects the appropriate camera combination for the scene, and a data fusion unit for multi-camera image fusion. Specifically, the system includes a camera module 100, a hardware synchronization module 200, and a SLAM fusion module 300; the camera module 100 is mounted on the vehicle body 400 with the multiple camera types parallel to the horizontal plane, and comprises a monocular camera module 101, a binocular camera module 102, and an RGB-D c...
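The depth-camera step above (aligning the color map with the depth map) is a standard reprojection: back-project each depth pixel to 3D, transform it into the color camera's frame, and project it onto the color image grid. A minimal NumPy sketch, with illustrative (uncalibrated) intrinsics and extrinsics rather than the patent's parameters:

```python
# Minimal NumPy sketch of depth-to-color alignment for an RGB-D camera.
# K_d/K_c/T_dc are illustrative assumptions, not calibrated values.

import numpy as np

def align_depth_to_color(depth, K_d, K_c, T_dc):
    """Reproject each depth pixel into the color camera's image plane.

    depth : (H, W) depth map in meters (0 = invalid)
    K_d, K_c : 3x3 intrinsics of the depth and color cameras
    T_dc : 4x4 transform from the depth frame to the color frame
    Returns an (H, W) depth map registered to the color image grid.
    """
    h, w = depth.shape
    aligned = np.zeros_like(depth)
    K_d_inv = np.linalg.inv(K_d)
    vs, us = np.nonzero(depth > 0)
    for u, v in zip(us, vs):
        z = depth[v, u]
        # back-project the depth pixel to a 3D point in the depth frame
        p_d = z * (K_d_inv @ np.array([u, v, 1.0]))
        # move the point into the color camera frame
        p_c = (T_dc @ np.append(p_d, 1.0))[:3]
        if p_c[2] <= 0:
            continue
        # project into the color image and round to the nearest pixel
        uv = K_c @ (p_c / p_c[2])
        uc, vc = int(round(uv[0])), int(round(uv[1]))
        if 0 <= uc < w and 0 <= vc < h:
            aligned[vc, uc] = p_c[2]
    return aligned

# Sanity check: identical intrinsics and identity extrinsics leave the
# depth value at the same pixel.
K = np.array([[300.0, 0, 64], [0, 300.0, 48], [0, 0, 1]])
depth = np.zeros((96, 128)); depth[48, 64] = 2.0
out = align_depth_to_color(depth, K, K, np.eye(4))
print(out[48, 64])  # 2.0
```

Production RGB-D drivers do this registration on-device or vectorized; the per-pixel loop here is only for clarity.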



Abstract

The invention discloses a multi-sensor fusion SLAM algorithm and system. The method comprises the following steps: multiple types of camera modules are installed at the front of a vehicle body; camera calibration is performed on the camera modules; the camera modules collect image information, correct the collected images, extract ORB feature points, calculate a depth map, and align the collected color map with the depth map; and a hardware synchronization module synchronizes the timestamps of the different types of image information acquired by the camera modules. The beneficial effects are that various scenes are considered and multiple data sources are fused, so the positioning result is reliable and the limitations of a single sensor are overcome.

Description

technical field
[0001] The invention relates to the technical field of simultaneous localization and mapping (SLAM), in particular to a multi-sensor fusion SLAM algorithm and fusion system.
Background technique
[0002] In a vehicle advanced driver assistance system, positioning and mapping are indispensable technologies. For example, when the GPS signal is weak or absent, the vehicle needs to rely on SLAM technology for navigation and path planning. The advanced driver assistance system uses various sensors installed on the car, such as millimeter-wave radar, lidar, monocular/binocular cameras, and satellite navigation, to sense the surrounding environment at all times while the car is driving, collect data, identify, detect, and track static and dynamic objects, and, combined with the navigator's map data, perform systematic calculation and analysis, so that the driver can be aware of possible dangers in advance, ef...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01S19/45; G01S19/47; G01C21/34; G06K9/62
CPC: G01S19/45; G01S19/47; G01C21/3446; G06F18/25
Inventors: 张裕, 王宇航, 王斌
Owner: 的卢技术有限公司