
SLAM method based on fusion of monocular vision feature method and direct method

A technology combining monocular vision with the feature method, applied in surveying and mapping, navigation, photo interpretation, image data processing, etc. It addresses problems such as the failure of feature-based tracking in low-texture scenes and achieves the effect of eliminating accumulated error.

Inactive Publication Date: 2021-02-26
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

However, when the environment contains little texture or the texture is highly repetitive (as in figure 1), feature-based SLAM can rely only on a limited number of feature points, and once the number of feature points falls below a certain threshold, the method fails.

Method used



Examples


Embodiment 1

[0030] An embodiment of the present invention provides a SLAM method based on the integration of a monocular vision-based feature method and a direct method. The method includes the following steps:

[0031] 101: The initial image is acquired by the monocular camera sensor and converted into a grayscale image by an algorithm provided by OpenCV; DSO features are then extracted from the grayscale image;

[0032] In a specific implementation, extracting DSO features requires only the gray gradients present in the grayscale image, so they are easier to extract than ORB features, and the set of DSO feature points contains the ORB feature points.
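As a rough illustration of gradient-based point selection of this kind, the sketch below converts an input frame to grayscale with OpenCV and keeps one high-gradient pixel per grid cell. The function name, threshold, and cell size are assumptions made for illustration, not the patent's selection rule:

```python
import cv2
import numpy as np

def select_dso_candidates(bgr_image, grad_threshold=30.0, cell=16):
    """Convert the frame to grayscale and keep one high-gradient pixel per
    grid cell as a DSO-style candidate point (illustrative values only)."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)

    # Sobel gradients; DSO-style points only need the gray-gradient magnitude.
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    mag = cv2.magnitude(gx, gy)

    h, w = gray.shape
    points = []
    for y0 in range(0, h - cell, cell):
        for x0 in range(0, w - cell, cell):
            patch = mag[y0:y0 + cell, x0:x0 + cell]
            dy, dx = np.unravel_index(np.argmax(patch), patch.shape)
            if patch[dy, dx] > grad_threshold:
                points.append((x0 + dx, y0 + dy))
    return gray, points
```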

[0033] Here, OpenCV is a programming function library for real-time computer vision; it is well known to those skilled in the art and is not described in detail in this embodiment of the present invention.

[0034] 102: Extract ORB features from the DSO features, and perform feature-pair matching between the DSO features and the ORB features, specifically: ...
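The ORB extraction and matching step can be illustrated with standard OpenCV calls. The sketch below matches ORB descriptors between two grayscale frames with a brute-force Hamming matcher; it is a generic example, not the patent's specific DSO/ORB pairing procedure (which is elided above):

```python
import cv2

def extract_and_match_orb(gray_prev, gray_curr, n_features=1000):
    """Extract ORB keypoints/descriptors in two grayscale frames and match
    them with a brute-force Hamming matcher (generic OpenCV usage)."""
    orb = cv2.ORB_create(nfeatures=n_features)
    kp1, des1 = orb.detectAndCompute(gray_prev, None)
    kp2, des2 = orb.detectAndCompute(gray_curr, None)

    if des1 is None or des2 is None:
        return kp1, kp2, []

    # Hamming distance suits ORB's binary descriptors; crossCheck keeps mutual best matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    return kp1, kp2, matches
```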



Abstract

The invention discloses a SLAM method based on the fusion of a monocular vision feature method and a direct method. The method comprises the steps of extracting DSO features from a grayscale image, extracting ORB features from the DSO features, and carrying out feature-pair matching of the DSO features and the ORB features; inserting the key frame into the local map, updating the existing map points, removing redundant map points, extracting new map points from the key frame, and updating the map points; projecting the existing map points onto the key frame, constructing local bundle adjustment, and adjusting the pose of the camera; removing redundant key frames, querying the database, carrying out Sim(3) similarity calculation, and detecting whether a closed loop exists; if so, carrying out closed-loop fusion, optimizing the essential graph, and ending the process; if not, entering the next step; and constructing global bundle adjustment, adjusting the pose of the camera to minimize the overall error of the SLAM system, and updating the map.
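Read as control flow, the abstract amounts to the structural sketch below. Every stage label is a paraphrase of the abstract, and the class is a placeholder that only records execution order; it is not an actual implementation or API:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SlamPipelineSketch:
    """Records the order of the stages named in the abstract.
    Every stage label is a paraphrase; nothing here is a real API."""
    stages: List[str] = field(default_factory=list)

    def run(self, loop_detected: bool) -> List[str]:
        # Front end: DSO features, ORB features extracted from them, feature-pair matching.
        self._do("extract DSO features from grayscale image")
        self._do("extract ORB features from DSO features")
        self._do("match DSO/ORB feature pairs")
        # Local mapping: keyframe insertion, map-point maintenance, local bundle adjustment.
        self._do("insert keyframe into local map")
        self._do("update existing map points, remove redundant map points")
        self._do("extract new map points from keyframe")
        self._do("project map points onto keyframe, local bundle adjustment (adjust camera pose)")
        self._do("remove redundant keyframes")
        # Loop closing: Sim(3) similarity against the database decides the final step.
        self._do("query database, Sim(3) similarity calculation")
        if loop_detected:
            self._do("closed-loop fusion, optimize essential graph")
        else:
            self._do("global bundle adjustment (minimize overall error), update map")
        return self.stages

    def _do(self, stage: str) -> None:
        self.stages.append(stage)

if __name__ == "__main__":
    for step in SlamPipelineSketch().run(loop_detected=False):
        print(step)
```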

Description

technical field

[0001] The present invention relates to the field of autonomous positioning and mapping of mobile devices, and in particular to a SLAM method based on the fusion of a monocular vision feature method and a direct method.

Background technique

[0002] SLAM (Simultaneous Localization and Mapping) is regarded as one of the core technologies enabling mobile devices to move autonomously. In recent years, through the joint efforts of a group of outstanding researchers, it has gradually matured, and in the future it can be widely applied to 3D reconstruction, AR / VR (Augmented Reality / Virtual Reality) devices, and mobile robots.

[0003] Depending on the sensor used, such as an inertial measurement unit (IMU), laser radar (lidar), or camera, SLAM can be divided into INS (Inertial Navigation System, inertial navigation sy...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T17/05; G06T7/73; G06T5/50; G01C11/04; G01C21/00
CPC: G06T17/05; G06T7/73; G06T5/50; G01C21/005; G01C11/04; G06T2207/20221
Inventor: 张鹏, 王磊, 付忠霖, 梁雄
Owner: TIANJIN UNIV