
Robot scene self-adaptive pose estimation method based on RGB-D camera

A technology for pose estimation and robotics, applied in instrumentation, computing, and image data processing; it addresses problems such as algorithm failure, sparse scene features in some regions, and the inability to obtain a pose estimate

Pending Publication Date: 2019-09-10
HUNAN UNIV +1

AI Technical Summary

Problems solved by technology

However, the defect of this type of method is that it relies heavily on the selection of feature points. First, mismatched point pairs in the feature-point set severely affect the initial value of the 3D estimation; second, the algorithm is effective only for scenes with many image feature points. If the scene's feature points are sparse, tracking easily fails or the error becomes extremely large, which directly causes the algorithm to fail, so that a stable pose estimate cannot be obtained.
In a real robot operating environment, some regions of the scene are feature-rich while others are feature-sparse, and a pose estimation method that considers only one of these cases is difficult to make work.
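As an illustration of the mismatch problem described above (this is not the patent's own matching strategy): one common way to reject mismatched feature pairs before pose estimation is Lowe's ratio test, which keeps a match only when the best descriptor distance is clearly smaller than the second-best distance. The names `ratio_test`, `matches`, and the 0.75 threshold below are hypothetical illustration choices.

```python
def ratio_test(matches, ratio=0.75):
    """Keep only distinctive matches.

    matches: list of (index, best_dist, second_best_dist) tuples, where the
    distances are descriptor distances to the best and second-best candidates.
    A match is kept when best_dist < ratio * second_best_dist.
    """
    return [idx for idx, best, second in matches if best < ratio * second]

# Toy data: matches 0 and 2 are ambiguous (the two distances are close),
# match 1 is clearly distinctive and survives the test.
matches = [(0, 0.90, 0.95), (1, 0.30, 0.80), (2, 0.50, 0.55)]
print(ratio_test(matches))  # → [1]
```

Filtering like this reduces the influence of wrong correspondences on the initial 3D estimate, which is exactly the failure mode the paragraph above describes.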




Embodiment Construction

[0111] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0112] The RGB-D camera can simultaneously acquire a two-dimensional color image I and a three-dimensional point cloud D of the scene, where the points of the point cloud D_t(u, v) correspond one-to-one to the pixels of the two-dimensional color image I_t(u, v). That is, the pixel I_t(u, v) in row u and column v of the color image corresponds to the three-dimensional point D_t(u, v) = (x, y, z), which carries the depth information of that pixel; the three-dimensional point cloud D is the set of three-dimensional space points corresponding to all pixels of the two-dimensional color image.
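The pixel-to-point correspondence described above can be sketched with the standard pinhole back-projection model. This is a minimal illustration, not the patent's implementation; the intrinsic parameters fx, fy, cx, cy and the function name `depth_to_point_cloud` are assumptions for the example.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (H x W, metres) into an H x W x 3 point cloud.

    For the pixel in row u, column v with depth z = depth[u, v]:
        x = (v - cx) * z / fx
        y = (u - cy) * z / fy
    so cloud[u, v] is the 3D point corresponding one-to-one to pixel (u, v).
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    z = depth
    x = (v - cx) * z / fx
    y = (u - cy) * z / fy
    return np.stack([x, y, z], axis=-1)

depth = np.full((4, 4), 2.0)  # toy depth map: a flat wall 2 m away
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud[2, 2])  # pixel at the principal point → [0. 0. 2.]
```

The resulting array preserves the one-to-one pixel/point layout that the paragraph describes: `cloud[u, v]` is D_t(u, v) for pixel I_t(u, v).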

[0113] As shown in Figure 1, the flow chart of the present invention, a robot scene adaptive pose estim...



Abstract

The invention discloses a robot scene self-adaptive pose estimation method based on an RGB-D camera. Two-dimensional color image information of adjacent scene frames, and the spatial depth information corresponding to the pixels of the two-dimensional color image, are acquired with an RGB-D camera. When the two-dimensional color image feature points are sufficient, ORB operators are used to extract features, the matching strategy provided by the invention is used to match them accurately, and the three-dimensional pose is solved with a pose estimation algorithm based on the matched feature points; when the feature points are insufficient, the improved ICP algorithm provided by the invention is used to solve the three-dimensional pose. A complete switching criterion is then designed to fuse the two pose estimation methods. Finally, the pose estimates obtained by the two methods are optimized with a bundle adjustment algorithm, yielding a smooth and accurate three-dimensional pose estimate. The three-dimensional pose estimation algorithm has the outstanding advantages of high robustness, high precision, small computational cost, and adaptability to different scenes.
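The scene-adaptive idea in the abstract can be sketched as a switch on the number of usable feature matches. The actual switching criterion, matching strategy, and improved ICP are defined in the patent text; `choose_method`, `MIN_FEATURES`, and the stub solvers below are hypothetical stand-ins for illustration only.

```python
MIN_FEATURES = 100  # assumed threshold on good ORB matches (illustrative)

def solve_pose_from_matches(matches):
    """Stand-in for feature-based pose estimation (e.g. PnP on ORB matches)."""
    return "pose_from_feature_matches"

def solve_pose_icp(cloud_prev, cloud_curr):
    """Stand-in for the point-cloud route (an ICP-style alignment)."""
    return "pose_from_icp"

def choose_method(num_good_matches):
    # Feature-rich scene: use the feature-based estimator;
    # feature-sparse scene: fall back to point-cloud registration.
    return "features" if num_good_matches >= MIN_FEATURES else "icp"

print(choose_method(250))  # → features
print(choose_method(12))   # → icp
```

In the patent, both branches additionally feed a bundle adjustment step so that the fused trajectory stays smooth across switches; the sketch above only shows the branching itself.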

Description

technical field [0001] The invention belongs to the field of robot control, and in particular relates to a robot scene adaptive pose estimation method based on an RGB-D camera. Background technique [0002] Real-time, robust, high-precision 3D pose estimation is one of the research difficulties and hotspots in the field of robotics. Its goal is to estimate in real time the change in the robot's 3D spatial pose between two adjacent moments. It is core content of robot SLAM (simultaneous localization and mapping), motion tracking, AR (augmented reality), and the like. Traditional navigation systems based on inertial sensors are widely used in pose estimation, but suffer from problems such as drift and error accumulation, so the accuracy and reliability of the corresponding pose estimation are low. Compared with an inertial navigation system, vision-based pose estimation does not have the problem of physical drift, and the cumulative error can be effectively eliminated through the global visua...

Claims


Application Information

IPC(8): G06T7/73
CPC: G06T7/73; G06T2207/10028; G06T2207/10024; Y02T10/40
Inventor: 余洪山, 付强, 林鹏, 孙炜, 杨振耕, 赖立海, 陈昱名, 吴思良
Owner HUNAN UNIV