
PTAM improvement method based on ground characteristics of intelligent robot

A technology in the field of robot vision for intelligent robots, addressing problems such as the inability to build a metric map and strict restrictions on camera movement.

Publication Status: Inactive
Publication Date: 2015-06-24
BEIJING UNIV OF TECH
Cites: 6 | Cited by: 88

AI Technical Summary

Problems solved by technology

The unimproved PTAM algorithm cannot establish a metric map and imposes strict restrictions on the movement of the camera.


Detailed Description of the Embodiments

[0093] The present invention is described in further detail below in conjunction with the accompanying drawings.

[0094] The flow chart of the improved PTAM algorithm based on ground features is shown in Figure 1. The method specifically includes the following steps:

[0095] Step 1, parameter correction

[0096] Step 1.1, parameter definition

[0097] The pose representation of the robot is constructed from the relationship between the robot coordinate system and the world coordinate system, and the ground-plane calibration parameters are determined from the pose relationship between the camera and the target plane.
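To make the coordinate relationships concrete, the following is a minimal sketch in Python, not taken from the patent. It assumes the robot pose in the world is a 4x4 homogeneous transform T_wr, the camera is mounted on the robot with a fixed extrinsic T_rc, and the world ground plane is given by a normal n and offset d; all numeric values and function names are hypothetical.

import numpy as np

def se3(R, t):
    # Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,).
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def plane_in_camera(T_wc, n_w, d_w):
    # Re-express the world plane n_w . X_w = d_w in the camera frame.
    # T_wc is the camera pose in the world (maps camera coords to world coords).
    R, t = T_wc[:3, :3], T_wc[:3, 3]
    n_c = R.T @ n_w           # rotate the plane normal into the camera frame
    d_c = d_w - n_w @ t       # shift the plane offset by the camera position
    return n_c, d_c

# Hypothetical example: robot at (1, 2, 0) m in the world, camera 0.5 m above
# the robot origin; the world ground plane is z = 0 (n = [0, 0, 1], d = 0).
T_wr = se3(np.eye(3), np.array([1.0, 2.0, 0.0]))   # robot pose in the world
T_rc = se3(np.eye(3), np.array([0.0, 0.0, 0.5]))   # camera extrinsic on the robot
T_wc = T_wr @ T_rc                                  # camera pose in the world
n_c, d_c = plane_in_camera(T_wc, np.array([0.0, 0.0, 1.0]), 0.0)
print(n_c, d_c)   # ground-plane calibration parameters in the camera frame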

[0098] Step 1.2, camera calibration

[0099] The FOV (field-of-view) distortion model is used to calibrate the monocular camera: image pixel coordinates are mapped onto the normalized coordinate plane and, combined with the camera intrinsic matrix K, the image distortion is corrected.
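The patent does not give the calibration formulas here, so the sketch below is illustrative only. It assumes the single-parameter FOV model of Devernay and Faugeras (the model commonly used with PTAM), in which the distorted radius r_d relates to the undistorted radius r_u by r_d = (1/omega) * arctan(2 * r_u * tan(omega/2)); the intrinsic matrix K and the distortion parameter omega are assumed to come from a prior calibration, and the numeric values are hypothetical.

import numpy as np

def undistort_to_normalized(u, v, K, omega):
    # Map a distorted pixel (u, v) to undistorted normalized coordinates
    # by inverting the single-parameter FOV model.
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    xd = (u - cx) / fx                 # distorted normalized coordinates
    yd = (v - cy) / fy
    rd = np.hypot(xd, yd)
    if rd < 1e-9 or omega < 1e-9:
        return xd, yd                  # negligible distortion near the image center
    # invert r_d = (1/omega) * arctan(2 * r_u * tan(omega/2))
    ru = np.tan(rd * omega) / (2.0 * np.tan(omega / 2.0))
    s = ru / rd
    return xd * s, yd * s

# Hypothetical intrinsics and distortion parameter for illustration.
K = np.array([[458.0, 0.0, 320.0],
              [0.0, 458.0, 240.0],
              [0.0, 0.0, 1.0]])
omega = 0.92
print(undistort_to_normalized(400.0, 300.0, K, omega))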

[0100] Step 2. Initialization based on ground features

[0101] S...

Abstract

The invention discloses a PTAM improvement method based on the ground characteristics of an intelligent robot. The method comprises the following steps: first, parameter correction is completed, which includes parameter definition and camera calibration; next, the current environment texture information is obtained by means of a camera, a four-level Gaussian image pyramid is constructed, the feature information in the current image is extracted with the FAST corner detection algorithm, data association between corner features is established, and a pose estimation model is then obtained; at the initial mapping stage, the camera mounted on the mobile robot acquires two key frames; during initialization the mobile robot begins to move, and the camera captures corner information in the current scene while establishing associations; after the three-dimensional sparse map is initialized, the key frames are updated, a sub-pixel-accurate mapping relation between feature points is established by means of epipolar line search and block matching, and accurate relocalization of the camera is achieved based on the pose estimation model; finally, the matched points are projected into space, so that a three-dimensional map of the current overall environment is established.
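As a minimal illustration of the pyramid-plus-FAST step described above, the sketch below uses OpenCV in Python on a single grayscale frame; the four pyramid levels and the FAST detector follow the abstract, while the detection threshold, file path, and helper names are assumptions made for the example.

import cv2

def build_pyramid(gray, levels=4):
    # Build a Gaussian image pyramid with the given number of levels.
    pyramid = [gray]
    for _ in range(levels - 1):
        pyramid.append(cv2.pyrDown(pyramid[-1]))
    return pyramid

def detect_fast_corners(pyramid, threshold=20):
    # Run FAST corner detection with non-maximum suppression on every level.
    fast = cv2.FastFeatureDetector_create(threshold=threshold, nonmaxSuppression=True)
    return [fast.detect(level, None) for level in pyramid]

# Illustrative usage on one camera frame (the file path is hypothetical).
frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
pyramid = build_pyramid(frame, levels=4)
corners_per_level = detect_fast_corners(pyramid)
print([len(kps) for kps in corners_per_level])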

Description

Technical field

[0001] The invention belongs to the field of robot vision and relates to an improvement of the PTAM algorithm based on ground features.

Background technique

[0002] As the relationship between robots and humans becomes closer, the related technologies of intelligent robots have received great attention. Simultaneous Localization and Mapping (SLAM) of mobile robots is one of the most mainstream positioning technologies for intelligent mobile robots. It is essentially a motion estimation problem: the internal and external data obtained by the sensors are used to calculate the position of the mobile robot at a given moment, while at the same time the map model it depends on is built. Vision-based SLAM belongs to the research category of visual measurement, because the vision sensor has unique advantages: small size, light weight, low price, easy installation, and very rich extracted external information. These advantages further promote the current research and applica...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G01C21/00
Inventors: 贾松敏, 王可, 宣璇, 张鹏, 董政胤
Owner: BEIJING UNIV OF TECH