
Robot positioning and mapping method and device based on point-line feature fusion

A robot positioning and mapping technology based on point-line feature fusion, applied in the field of computer technology, addressing problems such as tracking failure, inaccurate positioning, and inability to determine the robot's pose

Active Publication Date: 2020-03-06
HEFEI UNIV OF TECH
View PDF · Cites: 4 · Cited by: 3

AI Technical Summary

Problems solved by technology

[0004] Existing visual SLAM systems based on RGB-D cameras, such as ORB-SLAM, PL-SLAM and PTAM (Parallel Tracking and Mapping), realize parallel tracking and mapping and use non-linear optimization in the back end to meet the real-time requirements of visual SLAM. In the process of implementing this application, however, the inventors found at least the following problems in the existing technology: in low-texture and motion-blurred scenes, visual SLAM relocalization with the above methods easily leads to tracking failure, making positioning imprecise or even impossible. Therefore, how to accurately position and map the robot in low-texture, motion-blurred and similar scenes has become an urgent problem to be solved.
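To illustrate why point-only trackers struggle in such scenes, the sketch below scores frames by mean gradient magnitude. This is a crude proxy of our own (not the patent's method, and `texture_score` is a hypothetical helper name) for how many point features a detector such as ORB could find: a blank, low-texture frame scores near zero, which is exactly the regime where point-only tracking loses track.

```python
import numpy as np

def texture_score(gray):
    """Mean gradient magnitude of a grayscale frame.

    Assumed proxy: low-texture frames yield few high-gradient pixels,
    hence few point features and a high risk of tracking failure.
    """
    gy, gx = np.gradient(gray.astype(np.float64))
    return float(np.mean(np.hypot(gx, gy)))

rng = np.random.default_rng(0)
flat = np.full((64, 64), 128.0)               # low-texture frame (blank wall)
textured = rng.uniform(0.0, 255.0, (64, 64))  # richly textured frame

low, high = texture_score(flat), texture_score(textured)
```

A system could use such a score to decide when to lean on line features instead of (or in addition to) point features.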

Method used



Embodiment Construction

[0070] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the technical field of the present application. The terms used herein in the description are only for the purpose of describing specific embodiments and are not intended to limit the present application. The terms "comprising" and "having", and any variations thereof, in the specification, claims and drawings of the present application are intended to cover non-exclusive inclusion. The terms "first", "second" and the like in the description, claims or drawings are used to distinguish different objects, rather than to describe a specific order.

[0071] Reference herein to an "embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the present application. The appearanc...



Abstract

The invention relates to a robot positioning and mapping method and device based on point-line feature fusion, a computer device and a storage medium. The method comprises the following steps: using an RGB-D camera to detect the surrounding environment and acquire an RGB image and a depth image; determining continuous image frames; extracting point features from the continuous image frames; extracting line features from the continuous image frames using an adaptive line segment detection method; performing feature matching on the point and line features using a feature matching algorithm and a screening mechanism to obtain an inter-frame initial pose; and finally, minimizing the error of the inter-frame initial pose with a point-line error model to obtain the inter-frame pose and map points, and generating a local map based on the inter-frame pose and map points. Tracking robustness is thus improved through point-line feature fusion, the problem of tracking failure in scenes such as low texture and motion blur is avoided, and the positioning and mapping accuracy of the robot is effectively improved.
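The abstract's "point-line error model" is not spelled out here, but PL-SLAM-style formulations typically combine point reprojection residuals with the distances of projected 3D line endpoints to the observed image line. The sketch below shows one such combined cost under those assumptions; the function names and the exact weighting are illustrative, not the patent's definition.

```python
import numpy as np

def project(K, R, t, X):
    """Pinhole projection of 3D points X (N,3) into pixel coordinates (N,2)."""
    Xc = X @ R.T + t              # transform into the camera frame
    uvw = Xc @ K.T                # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3]

def point_line_cost(K, R, t, pts3d, pts2d, line3d, line2d):
    """Combined point-line reprojection cost (assumed PL-SLAM-style form).

    pts3d/pts2d: matched 3D map points and their 2D observations.
    line3d: (2,3) endpoints of a 3D line segment.
    line2d: (3,) observed image line a*u + b*v + c = 0, with a^2 + b^2 = 1.
    Point term: squared pixel reprojection residuals.
    Line term: squared signed distances of the projected endpoints
    to the observed image line.
    """
    e_pts = project(K, R, t, pts3d) - pts2d
    ends = project(K, R, t, line3d)
    ends_h = np.hstack([ends, np.ones((2, 1))])   # homogeneous endpoints
    e_line = ends_h @ line2d
    return float(np.sum(e_pts**2) + np.sum(e_line**2))

K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
pts3d = np.array([[0.0, 0.0, 2.0], [0.5, 0.2, 3.0]])
pts2d = project(K, R, t, pts3d)                   # noise-free observations
line3d = np.array([[0.0, -0.5, 2.0], [0.0, 0.5, 2.0]])  # vertical 3D segment
line2d = np.array([1.0, 0.0, -320.0])             # image line u = 320
cost = point_line_cost(K, R, t, pts3d, pts2d, line3d, line2d)
```

Minimizing this cost over (R, t), for example with Gauss-Newton or Levenberg-Marquardt, yields the refined inter-frame pose; at the true pose with noise-free observations the cost is zero.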

Description

technical field

[0001] The present application relates to the field of computer technology, and in particular to a method, device, computer equipment and storage medium for robot positioning and mapping based on point-line feature fusion.

Background technique

[0002] In recent years, technologies such as unmanned driving, robots, drones, and AR/VR have developed rapidly. At the same time, positioning and map construction have become hot research issues and are considered key basic technologies in these fields. This is because, in an unknown environment, accurate positioning of the robot requires an accurate environmental map, while building an accurate environmental map requires the robot to know its exact position in the environment. SLAM (Simultaneous Localization and Mapping) technology enables robots and other carriers to set off from an unknown location in an unknown environment and use a series of onboard sensors (lidar, GPS, IMU, camera, etc.) to observe the ...
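The incremental localization half of SLAM described above can be sketched in its simplest form: each body-frame motion increment is rotated into the world frame and accumulated onto the current pose. This toy SE(2) dead-reckoning example is our own illustration (the `compose` helper is hypothetical), not the patent's pipeline, which operates on camera poses in SE(3).

```python
import numpy as np

def compose(pose, delta):
    """Compose an SE(2) pose (x, y, theta) with a body-frame motion increment.

    The increment (dx, dy, dtheta) is expressed in the robot's own frame,
    so its translation is rotated by the current heading before accumulating.
    """
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * np.cos(th) - dy * np.sin(th),
            y + dx * np.sin(th) + dy * np.cos(th),
            th + dth)

# Drive a 1 m x 1 m square: four forward moves, each followed by a 90° turn.
pose = (0.0, 0.0, 0.0)
for _ in range(4):
    pose = compose(pose, (1.0, 0.0, np.pi / 2))
```

Pure dead reckoning like this accumulates drift in practice, which is why SLAM systems additionally observe landmarks (here, point and line features) to correct the pose.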

Claims


Application Information

IPC(8): G06K9/00; G06K9/46; G06T17/05
CPC: G06T17/05; G06V20/10; G06V10/56; G06V10/44; Y02T10/40
Inventor: 方宝富, 王浩, 杨静, 詹志强, 王乐, 韩修萌
Owner HEFEI UNIV OF TECH