
Panoramic inertial navigation SLAM method based on multiple key frames

This patent covers keyframe and panoramic-vision technology, classified under navigation by velocity/acceleration measurement and camera devices. It addresses problems of the prior art, including the strong influence of scale-estimation error, the failure of visual SLAM methods under degraded conditions, and sensitivity to lighting. The claimed effects are improved accuracy and robustness, verified real-time performance of the system, and good computational efficiency.

Active Publication Date: 2019-02-05
HEFEI INSTITUTES OF PHYSICAL SCIENCE - CHINESE ACAD OF SCI +1

AI Technical Summary

Problems solved by technology

[0004] At present, traditional vision methods are based on RGBD cameras: depth data are obtained and then converted into two-dimensional laser data. This neither makes full use of the visual information nor constitutes a true visual SLAM framework; see, for example, patent application No. 201710835285.4, "A Kinect-based indoor mobile robot visual SLAM method". Many scholars also study monocular SLAM, but such methods use only a single camera, have a limited field of view, and cannot directly estimate the distance of scene features; see, for example, patent application No. 201710658521.X, "Monocular visual SLAM algorithm based on the semi-direct method and sliding-window optimization". The paper "ORB-SLAM: a versatile and accurate monocular SLAM system" proposes a multi-threaded, keyframe-based visual SLAM method that uses highly stable ORB features and runs in real time, but it is sensitive to lighting and easily loses tracking in extreme situations such as fast rotation.
None of the above patents and papers solves the problem of the limited field of view, and none can be applied in weak-texture environments. Moreover, when vision degrades (motion blur, large scenes), scale-estimation error grows, causing the visual SLAM method to fail.
[0005] Therefore, how to provide a stable and reliable visual SLAM method that overcomes the insufficient information obtained by prior-art environment-perception methods and the uncertainty introduced by rapid rotation, while guaranteeing accuracy and stability, has become a technical problem to be solved in the art.



Examples


Embodiment 1

[0034] This embodiment provides a multi-keyframe-based panoramic inertial navigation SLAM method. Figure 1 shows a flow diagram of the method. As shown in Figure 1, the panoramic inertial navigation SLAM method comprises the following steps:

[0035] S1: panoramic inertial-navigation information collection and feature extraction. In this embodiment, hardware-synchronized triggering runs the inertial unit in real time at 100 Hz and the three cameras in real time at 20 Hz. AGAST corner points are used to extract features from each camera image, and descriptors are computed with a task-based BRIEF online learning algorithm. In this embodiment, the multi-eye panoramic vision projection determines the multi-body frame from the extrinsic relationship of the three cameras and extracts the projection relationship between the fea...
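As a rough illustration of what the 100 Hz IMU / 20 Hz camera synchronization in step S1 implies downstream, the sketch below groups IMU samples into the inter-frame windows a visual-inertial pipeline would feed to IMU pre-integration. This is not code from the patent; the rates come from the embodiment, but all names and the grouping scheme are illustrative assumptions.

```python
# Toy timestamp association between a 100 Hz IMU stream and 20 Hz camera
# frames. With hardware-triggered synchronization (as in the embodiment),
# every 5th IMU sample coincides with a camera frame; this sketch collects,
# for each camera frame, the IMU samples in the preceding inter-frame
# window, which is what an IMU pre-integration step would consume.
# All names here are illustrative, not from the patent.

IMU_HZ, CAM_HZ = 100, 20
imu_stamps = [i / IMU_HZ for i in range(100)]   # 1 s of IMU timestamps
cam_stamps = [i / CAM_HZ for i in range(20)]    # 1 s of camera timestamps

def imu_window(t0, t1, stamps):
    """Return IMU timestamps t with t0 < t <= t1 (one inter-frame window)."""
    return [t for t in stamps if t0 < t <= t1]

# Group IMU samples per camera inter-frame interval.
windows = [imu_window(cam_stamps[k], cam_stamps[k + 1], imu_stamps)
           for k in range(len(cam_stamps) - 1)]

print(len(windows))     # 19 inter-frame windows over 1 s
print(len(windows[0]))  # 5 IMU samples per 50 ms window (100 Hz / 20 Hz)
```

The 5-to-1 ratio is exactly why hardware triggering is attractive here: the camera exposure can be aligned with every fifth IMU sample, so no interpolation of inertial measurements is needed at frame boundaries.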



Abstract

The invention discloses a panoramic inertial navigation SLAM method based on multiple key frames. The method comprises the following steps: building a multi-camera panoramic visual model, combining an inertial navigation unit and tightly coupling the fused visual features, and achieving high-precision positioning and a scene map through nonlinear optimization over multiple key frames. The method solves the prior-art problems of insufficient information acquisition by environment-perception methods and the uncertainty of fast rotation, as well as the limited field of view, illumination influence, and visual degradation of existing visual SLAM methods. It provides an effective automated method for robot environment modeling, positioning, and navigation in complex environments such as strong light, weak texture, and motion blur; improves the accuracy and robustness of the method, thereby supporting autonomous robot navigation; and, by taking computational efficiency into account, achieves real-time visual SLAM performance.
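The "nonlinear optimization over multiple key frames" mentioned in the abstract is, at its core, an iterative nonlinear least-squares solve. The toy Gauss-Newton sketch below refines a 2-D position from range measurements to known landmarks; it stands in for (and is far simpler than) the multi-keyframe, tightly coupled visual-inertial optimization of the patent. The landmarks, measurements, and iteration count are made up for illustration.

```python
# Minimal Gauss-Newton sketch (pure Python, illustrative only) of the kind
# of nonlinear least-squares optimization the abstract refers to: a 2-D
# position is refined from range measurements to known landmarks.
import math

landmarks = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]   # assumed known positions
truth = (1.0, 2.0)
ranges = [math.dist(truth, lm) for lm in landmarks]  # noise-free for clarity

def gauss_newton(x, y, iters=10):
    for _ in range(iters):
        # Accumulate the normal equations  (J^T J) d = -J^T r.
        H = [[0.0, 0.0], [0.0, 0.0]]
        b = [0.0, 0.0]
        for (lx, ly), z in zip(landmarks, ranges):
            d = math.hypot(x - lx, y - ly)
            r = d - z                                # range residual
            jx, jy = (x - lx) / d, (y - ly) / d      # Jacobian row of d wrt (x, y)
            H[0][0] += jx * jx; H[0][1] += jx * jy
            H[1][0] += jy * jx; H[1][1] += jy * jy
            b[0] -= jx * r;     b[1] -= jy * r
        # Solve the 2x2 system by Cramer's rule and apply the update.
        det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
        dx = (b[0] * H[1][1] - b[1] * H[0][1]) / det
        dy = (H[0][0] * b[1] - H[1][0] * b[0]) / det
        x, y = x + dx, y + dy
    return x, y

x, y = gauss_newton(2.5, 0.5)   # start from a rough initial guess
print(round(x, 3), round(y, 3))  # converges to the true position (1.0, 2.0)
```

In the patent's setting, the state would instead contain keyframe poses, landmark positions, and IMU biases, and the residuals would mix reprojection errors with IMU pre-integration terms; solvers such as Ceres or g2o handle the resulting sparse normal equations.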

Description

Technical field
[0001] The invention belongs to the field of computer vision detection and relates to a visual-perception SLAM method, in particular a multi-keyframe-based panoramic inertial navigation SLAM method.
Background technique
[0002] SLAM is key to realizing the autonomy of mobile platforms. In recent years it has shown great application value in many fields, especially navigation robots and virtual reality. Simultaneous Localization and Mapping (SLAM) refers to a method in which a mobile platform, with its pose unknown, uses onboard or external sensors to perceive and construct a consistent environmental map while determining its own relative pose. Compared with laser SLAM, visual SLAM can obtain unstructured spatial information, provide semantically annotated scenes, and has greater application significance in complex environments. For this reason, to achieve robust and accurate visual SLAM, a general, real-time visual SLAM method is needed. ...

Claims


Application Information

IPC(8): G01C21/16; G01C11/02
CPC: G01C11/02; G01C21/16
Inventors: 张文, 刘勇, 张超凡, 王凡, 夏营威
Owner HEFEI INSTITUTES OF PHYSICAL SCIENCE - CHINESE ACAD OF SCI