
Monocular vision inertia SLAM method for dynamic scene

A monocular visual-inertial SLAM technology for dynamic scenes, applied in neural learning methods, measuring devices, instruments, etc.; it addresses the insufficient stability of existing robot SLAM systems in dynamic scenes.

Active Publication Date: 2020-05-15
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

[0003] In order to solve the problem of insufficient stability of the existing robot SLAM system in dynamic scenes, the present invention proposes a dynamic scene-oriented monocular visual inertial SLAM method

Method used




Embodiment Construction

[0125] The technical solutions provided by the present invention will be described in detail below in conjunction with the accompanying drawings. It should be understood that the following specific embodiments are only used to illustrate the present invention and not to limit the scope of the present invention.

[0126] The present invention provides a dynamic scene-oriented monocular visual-inertial SLAM method. The implementation principle is shown in Figure 1; the process mainly includes the following steps:

[0127] Step S1: Use the collected data set to train the neural network, which specifically includes the following processes:

[0128] S1.1 Collect sample data and build three data sets: a training set, a test set, and a validation set. The three sets are mutually independent and do not overlap. To prevent the network from over-fitting, the collected data are augmented; the following three methods are randomly applied to each image...
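The page truncates before listing the three augmentation methods, so the specific transforms below (horizontal flip, brightness jitter, additive Gaussian noise) are illustrative assumptions, not the patent's actual choices. A minimal sketch of randomly applying one augmentation per image:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image: np.ndarray) -> np.ndarray:
    """Apply one randomly chosen augmentation (hypothetical transforms)."""
    choice = rng.integers(3)
    if choice == 0:
        # horizontal flip
        return image[:, ::-1].copy()
    if choice == 1:
        # brightness jitter by a random gain
        gain = rng.uniform(0.7, 1.3)
        return np.clip(image.astype(float) * gain, 0, 255).astype(image.dtype)
    # additive Gaussian noise
    noise = rng.normal(0, 5, image.shape)
    return np.clip(image.astype(float) + noise, 0, 255).astype(image.dtype)

img = rng.integers(0, 256, (8, 8), dtype=np.uint8)
aug = augment(img)
```

Each augmented copy keeps the original shape and dtype, so it can be fed to the network alongside unmodified samples.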



Abstract

The invention discloses a monocular visual-inertial SLAM method for dynamic scenes. The method comprises the following steps: first, the visual front end extracts ORB feature points and performs target recognition with a YOLO-v3 neural network to extract a set of potentially static feature points; outliers are then removed by essential-matrix RANSAC, the final static feature points are screened out, and these are tracked. Meanwhile, to improve data-processing efficiency, the IMU measurements are pre-integrated. The system is initialized by computing initial values including attitude, velocity, gravity vector, and gyroscope bias. Tightly coupled visual-inertial nonlinear optimization is then performed and a map is built, while loop-closure detection and relocalization run in parallel; finally, global pose-graph optimization is carried out. By fusing deep learning with visual-inertial SLAM, the method can, to a certain extent, eliminate the influence of dynamic objects on SLAM localization and mapping, and improves the stability of the system over long-term operation.
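The abstract's first stage (keep only feature points outside YOLO-v3 detections of dynamic objects) can be sketched without the detector itself. This is a simplified illustration, not the patent's implementation: the label set `DYNAMIC_CLASSES`, the box format, and the function name are all assumptions, and the subsequent essential-matrix RANSAC screening step is omitted.

```python
import numpy as np

# Assumed set of object classes treated as potentially dynamic
DYNAMIC_CLASSES = {"person", "car", "bicycle"}

def filter_static(points: np.ndarray, detections: list) -> np.ndarray:
    """Keep feature points that fall outside every dynamic-object box.

    points:     (N, 2) array of pixel coordinates (x, y)
    detections: list of (label, x1, y1, x2, y2) bounding boxes
    """
    keep = np.ones(len(points), dtype=bool)
    for label, x1, y1, x2, y2 in detections:
        if label not in DYNAMIC_CLASSES:
            continue
        inside = ((points[:, 0] >= x1) & (points[:, 0] <= x2) &
                  (points[:, 1] >= y1) & (points[:, 1] <= y2))
        keep &= ~inside  # drop points inside a dynamic-object box
    return points[keep]

pts = np.array([[10, 10], [50, 50], [90, 90]], dtype=float)
dets = [("person", 40, 40, 60, 60)]
static = filter_static(pts, dets)  # (50, 50) removed; (10, 10) and (90, 90) kept
```

In the full pipeline, the surviving "potentially static" points would still be screened by essential-matrix RANSAC before tracking, since static points on the background can also be mislabeled or missed by the detector.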

Description

Technical field: [0001] The invention relates to a monocular visual-inertial SLAM method for dynamic scenes, belonging to the technical field of simultaneous localization and mapping for mobile robots.

Background technique: [0002] In recent years, computer vision and robotics have become hot research directions, and the most basic, low-level problem among them is localizing the robot itself. The sensors available for this are varied, including GPS, UWB, lidar, and wheel odometry; among them, visual sensors stand out for their low cost and for localizing in a manner similar to humans. Since a single vision sensor is easily affected by external conditions such as image blur and lighting, multiple sensors are often fused to improve robustness, and the IMU is the most complementary to vision. Traditional visual-inertial SLAM mainly runs in static scenes and cannot handle complex dynamic scenes. ...

Claims


Application Information

Patent Timeline
IPC(8): G01C21/00; G01C21/16; G01C21/20; G06N3/08
CPC: G01C21/005; G01C21/165; G01C21/20; G06N3/08
Inventors: 徐晓苏, 安仲帅, 吴贤, 魏宏宇, 侯岚华, 潘绍华
Owner SOUTHEAST UNIV