
SLAM autonomous navigation identification method in closed scene

A technology relating to an autonomous navigation and identification method, applied in the field of image processing, achieving strong practicability and simple calculation.

Pending Publication Date: 2022-06-03
山东融瓴科技集团有限公司


Problems solved by technology

[0004] To address the problem that current visual SLAM methods are usually based on the static-environment assumption, the present invention provides a SLAM autonomous navigation and identification method in a closed scene.



Examples


Embodiment 1

[0033] The present invention is further described below with reference to figure 1. A SLAM autonomous navigation and identification method in a closed scene comprises the following steps:

[0034] Step 1: Acquire external environment data. The autonomous robot acquires external environment data through its own camera.

[0035] Step 2: Feature detection. The acquired external environment data are input into the SLAM feature extraction module, which uses a Vision Transformer model to perform semantic detection of each object in the external environment while simultaneously extracting the features of each object.
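The feature-extraction step can be illustrated with a minimal sketch. The code below is an assumed, heavily simplified Vision Transformer front end (pure NumPy, random weights, a single attention head, no classification head), not the patent's actual model: it splits an image into patches, linearly embeds them, and applies one self-attention layer to produce per-patch features of the kind a SLAM feature-extraction module might consume.

```python
# Illustrative sketch (assumed architecture, not the patent's model):
# ViT-style patch embedding + one self-attention layer over the patches.
import numpy as np

def patchify(img, p):
    """Split an (H, W, C) image into (H*W/p^2, p*p*C) flattened patches."""
    H, W, C = img.shape
    patches = img.reshape(H // p, p, W // p, p, C).transpose(0, 2, 1, 3, 4)
    return patches.reshape(-1, p * p * C)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def vit_features(img, p=4, d=16, seed=0):
    """Patch embedding + single-head self-attention; returns per-patch features."""
    rng = np.random.default_rng(seed)
    x = patchify(img, p)                        # (N, p*p*C) raw patches
    W_embed = rng.normal(scale=0.02, size=(x.shape[1], d))
    tokens = x @ W_embed                        # (N, d) patch embeddings
    Wq, Wk, Wv = (rng.normal(scale=0.02, size=(d, d)) for _ in range(3))
    q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
    attn = softmax(q @ k.T / np.sqrt(d))        # (N, N) attention weights
    return attn @ v                             # (N, d) contextualized features

img = np.random.default_rng(1).random((16, 16, 3))  # toy 16x16 RGB image
feats = vit_features(img)
print(feats.shape)  # (16, 16): 16 patches of 4x4, each a 16-dim feature
```

In a real pipeline the weights would of course be learned, and the per-patch features would feed the data-association step that follows.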

[0036] Step 3: Data association. The SLAM data association module tracks features common to images in different frames and matches identical features through correlation clustering between the frames, from which the movement of each object is judged. For the successfully matched features, the controller of the autonomous robot correc...
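A minimal sketch of clustering-based inter-frame matching, assuming plain K-means over the pooled descriptors of two frames (the function names, the deterministic initialization, and the toy 2-D descriptors are illustrative choices, not details from the patent):

```python
# Illustrative sketch (not the patent's exact algorithm): match features
# between two frames by K-means clustering of their pooled descriptors.
# Features from different frames that land in the same cluster are
# treated as candidate matches.
import numpy as np

def kmeans(points, k, iters=50):
    """Plain K-means; returns a cluster label for each point."""
    centers = points[:k].copy()  # simple deterministic init for this sketch
    for _ in range(iters):
        # Assign each point to its nearest center.
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each center as the mean of its members (skip empty clusters).
        for c in range(k):
            members = points[labels == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    return labels

def match_by_clustering(desc_a, desc_b, k):
    """Cluster the pooled descriptors of frames A and B; greedily pair
    features that fall in the same cluster (one-to-one)."""
    pooled = np.vstack([desc_a, desc_b])
    labels = kmeans(pooled, k)
    la, lb = labels[:len(desc_a)], labels[len(desc_a):]
    matches, used_b = [], set()
    for i, ca in enumerate(la):
        for j, cb in enumerate(lb):
            if j not in used_b and ca == cb:
                matches.append((i, j))
                used_b.add(j)
                break
    return matches

# Toy example: 2-D "descriptors" of 4 features seen in two frames,
# with small inter-frame noise.
frame_a = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
frame_b = frame_a + 0.1
print(match_by_clustering(frame_a, frame_b, k=4))
# each feature pairs with its counterpart: [(0, 0), (1, 1), (2, 2), (3, 3)]
```

Features left unmatched by this scheme (clusters containing only one frame's descriptors) would be candidates for dynamic objects, which is the situation the invention targets.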

Embodiment 2

[0060] This embodiment differs from Embodiment 1 in that the cluster value is set to k = 100.



Abstract

The invention belongs to the technical field of image processing and particularly relates to a SLAM autonomous navigation and identification method in a closed scene. In the data association step of the method, the SLAM data association module tracks features common to images in different frames, matches identical features through correlation clustering between the frames, and thereby judges the movement of objects around the autonomous robot. Compared with the prior art, the method has the following advantages: (1) compared with the data association methods of existing SLAM (Simultaneous Localization and Mapping), clustering-based matching is more practical, and its matching effect is not degraded by various complex scenes; (2) the K-means clustering algorithm is computationally simple, can be embedded into various systems for feature matching, and is well suited to visual SLAM in dynamic environments.

Description

Technical field

[0001] The invention belongs to the technical field of image processing and in particular relates to a SLAM autonomous navigation and identification method in a closed scene.

Background technique

[0002] Simultaneous Localization and Mapping (SLAM) refers to the technology by which a robot, using visual sensors mounted on it and without prior information, perceives the surrounding environment, builds an environment model during movement, and estimates its own location.

[0003] Current visual SLAM methods are usually based on the static assumption, that is, objects in the environment are assumed to remain static throughout the visual SLAM process. However, most practical application environments are dynamic and contain moving objects, such as walking people and driving cars; these dynamic objects degrade the accuracy of mapping.

Summary of the invention

[0004] Aiming at the above-mentioned problem that th...

Claims


Application Information

IPC(8): G06V20/56; G06V10/762; G06V10/74; G06K9/62
CPC: G06F18/22; G06F18/23213
Inventors: 高文飞, 王磊, 王辉, 王瑞雪, 郭丽丽
Owner 山东融瓴科技集团有限公司