
RGB-D visual odometry taking ground constraint into consideration in indoor environment

A visual odometry technology for indoor environments, applied to the field of autonomous positioning of mobile robots, which addresses problems such as slow algorithm speed, the strong influence of feature preprocessing on the solution, and difficulty in meeting the real-time output requirements of the odometer, achieving the effect of improved estimation accuracy and speed.

Status: Inactive; Publication Date: 2017-04-05
HARBIN ENG UNIV
Cites: 5; Cited by: 37

AI Technical Summary

Problems solved by technology

[0004] When an RGB-D sensor is used to realize visual odometry, feature extraction and matching, as the preprocessing steps of data association, have a great impact on the solution speed of the system. Traditional methods usually rely on the SIFT and SURF algorithms, which are slow and make it difficult to meet the real-time output requirements of the odometer. Moreover, because traditional methods use the matched feature point sets to solve the motion transformation matrix directly, without adding geometric constraints derived from the structural features of the environment, the accuracy of such algorithms needs to be further improved.
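
For concreteness, the following is a minimal sketch (in Python, with NumPy as an assumed dependency) of the conventional unconstrained step referred to above: solving the rigid motion (R, t) directly from a matched 3D point set by a least-squares SVD (Kabsch) fit. The function name and details are illustrative and are not taken from the patent; the patent instead wraps this kind of estimate in RANSAC and adds ground-plane constraints.

```python
# Minimal sketch of the conventional, unconstrained step: least-squares rigid motion
# (R, t) from matched 3D point sets P -> Q via the SVD (Kabsch) closed form.
# Function name and details are illustrative, not taken from the patent.
import numpy as np

def rigid_transform_3d(P, Q):
    """Return R (3x3), t (3,) minimising sum ||R @ P_i + t - Q_i||^2 for matched (N,3) sets."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                                      # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])    # guard against a reflection
    R = Vt.T @ S @ U.T
    t = cq - R @ cp
    return R, t
```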

Examples


Embodiment Construction

[0051] The present invention will be further described below in conjunction with the accompanying drawings.

[0052] The invention discloses an RGB-D visual odometry method that considers ground constraints in an indoor environment, using an RGB-D camera as the sensor input device to realize the visual odometry function. The method includes ORB feature extraction and matching, ground plane detection and constraint addition, RANSAC motion transformation estimation, construction of a motion transformation error evaluation function to evaluate the estimation results, and output of the odometry result, among other steps. The invention adopts the ORB algorithm for feature extraction and matching on the colour images, which improves the speed of feature detection while meeting the accuracy requirements; it uses the depth image to detect the ground plane in the point cloud and corrects the pose transformation output of the visual odometry with the ground plane constraint, thereby improving the estimation accuracy.
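
As an illustration of the ORB preprocessing step named above, the sketch below extracts and matches ORB features between two frames using OpenCV. The feature count, the Hamming-distance brute-force matcher, and the ratio-test threshold are common assumed choices, not parameters specified by the patent.

```python
# ORB feature extraction and matching between two frames with OpenCV.
# nfeatures and the ratio-test threshold are assumed values, not from the patent.
import cv2

def match_orb(img_s, img_t, nfeatures=1000, ratio=0.75):
    """Detect ORB keypoints in frames I_s and I_t and return ratio-test filtered matches."""
    orb = cv2.ORB_create(nfeatures=nfeatures)
    kp_s, des_s = orb.detectAndCompute(img_s, None)
    kp_t, des_t = orb.detectAndCompute(img_t, None)

    # Hamming distance suits ORB's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    candidates = matcher.knnMatch(des_s, des_t, k=2)

    # Lowe's ratio test keeps only unambiguous correspondences.
    good = []
    for pair in candidates:
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    return kp_s, kp_t, good
```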

Abstract

The invention belongs to the field of self-localization of mobile robots, and particularly relates to RGB-D visual odometry taking ground constraints into consideration in an indoor environment. The method comprises: (1) in an indoor environment, collecting colour image information and depth image information during the motion of an RGB-D camera mounted on a mobile platform, and marking adjacent frame images as I_s and I_t; (2) solving through the back-projection function π⁻¹, according to the depth image information, to obtain the three-dimensional point cloud data V_s and V_t of the environment; and (3) extracting and matching ORB features, i.e. extracting and matching the feature points of the RGB-D images through the ORB algorithm. Image preprocessing, including feature extraction and matching, is completed with the ORB algorithm, which is about an order of magnitude faster than the SIFT and SURF algorithms. The ground plane in the point cloud is detected from the depth image, and the ground information is used to improve point set alignment, thereby improving the estimation precision of the motion transformation.
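
To make step (2) and the ground detection concrete, the sketch below back-projects a depth image into a 3D point cloud through the standard pinhole model (the role played by π⁻¹ here) and then fits a ground plane with a plain RANSAC loop. The camera intrinsics, depth scale, distance threshold and iteration count are illustrative placeholders, not values taken from the patent.

```python
# Back-projection pi^-1 (pinhole model) followed by RANSAC ground-plane fitting.
# fx, fy, cx, cy, depth_scale and the RANSAC thresholds are illustrative placeholders.
import numpy as np

def back_project(depth, fx, fy, cx, cy, depth_scale=1000.0):
    """Turn a (H, W) depth image into an (N, 3) point cloud, dropping invalid pixels."""
    v, u = np.indices(depth.shape)
    z = depth.astype(np.float64) / depth_scale          # raw depth units -> metres
    valid = z > 0
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=1)

def ransac_ground_plane(points, iters=200, dist_thresh=0.02, seed=0):
    """Fit a plane n.x + d = 0 to the point cloud with RANSAC; return (n, d, inlier mask)."""
    rng = np.random.default_rng(seed)
    best_model, best_inliers = None, None
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-9:                     # degenerate (collinear) sample
            continue
        n = n / np.linalg.norm(n)
        d = -np.dot(n, p0)
        inliers = np.abs(points @ n + d) < dist_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_model, best_inliers = (n, d), inliers
    return best_model[0], best_model[1], best_inliers
```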

Description

Technical field

[0001] The invention belongs to the field of autonomous positioning of mobile robots, and in particular relates to an RGB-D visual odometry method considering ground constraints in an indoor environment.

Background technique

[0002] In a navigation system, odometry is a method that uses sensor motion data to estimate the change in a robot's position over time, and is often used in wheeled mobile robots. Traditional odometers use a compass or wheel encoders and achieve positioning by measuring the number of rotations of the robot's wheels. However, when the tyres slip, a large deviation arises that is difficult to eliminate, and because errors accumulate over time, the odometer readings become increasingly unreliable, so this approach cannot meet application scenarios that require high precision. Visual Odometry (VO) is a technique that estimates the distance moved by the robot from a continuous image sequence. It can ob...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C22/00, G06T7/20, G06T7/11, G06T7/136, G06K9/46
CPC: G06T7/20, G01C22/00, G06T2207/10028, G06T2207/10024, G06V10/40
Inventors: 赵玉新, 李亚宾, 刘厂
Owner: HARBIN ENG UNIV