
Feature fusion-based RGB-D camera motion estimation method

A feature-fusion-based camera motion estimation technology, applied in computing, image data processing, instruments, etc., which addresses problems such as susceptibility to illumination changes and large noise.

Publication Date: 2018-05-18 (Inactive)
CENT SOUTH UNIV

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to solve two problems of existing camera motion estimation methods: they are easily affected by illumination, and uneven feature distribution introduces large noise.



Embodiment Construction

[0060] Embodiments of the present invention will be further described below in conjunction with the accompanying drawings.

[0061] The present invention proposes an RGB-D camera motion estimation method based on point and line feature fusion. The overall system block diagram is shown in Figure 1. The method includes the following steps:

[0062] S1. Two-dimensional feature extraction: the system extracts two-dimensional point features and two-dimensional line features from the input RGB image.

[0063] In the present invention, a group of two-dimensional point features is obtained with SURF (Speeded-Up Robust Features), and two-dimensional straight-line features are obtained with the LSD (Line Segment Detector) algorithm. For an RGB image, the feature set is {p_i, l_j | i = 1, 2, …; j = 1, 2, …}, where a two-dimensional point is p_i = [u_i, v_i]^T and a two-dimensional straight line l_j represents t...
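
As a concrete illustration of step S1, the following is a minimal sketch using OpenCV. The helper name extract_2d_features is hypothetical, not from the patent; SURF requires the non-free xfeatures2d module shipped with opencv-contrib-python, and createLineSegmentDetector is only present in OpenCV builds that include LSD (it was restored in 4.5.1 after a licensing-related removal).

```python
# Minimal sketch of step S1 (hypothetical helper, not the patent's code).
import cv2

def extract_2d_features(rgb_image):
    gray = cv2.cvtColor(rgb_image, cv2.COLOR_BGR2GRAY)

    # Two-dimensional point features p_i = [u_i, v_i]^T via SURF
    # (requires an OpenCV build with the non-free xfeatures2d module).
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    keypoints, descriptors = surf.detectAndCompute(gray, None)

    # Two-dimensional line features l_j via the LSD detector; each line is
    # returned as its endpoints [x1, y1, x2, y2].
    lsd = cv2.createLineSegmentDetector()
    lines = lsd.detect(gray)[0]

    return keypoints, descriptors, lines
```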



Abstract

The invention discloses a feature fusion-based RGB-D camera motion estimation method. The method first extracts two-dimensional point and two-dimensional straight-line features from an RGB image and, using the depth information in the D image, back-projects these two-dimensional features to obtain three-dimensional features. It then constructs the error of each three-dimensional point from the RGB measurement error and the depth measurement error, and measures the uncertainty of a straight line by computing the Mahalanobis distance between the three-dimensional back-projections of sampled points on a two-dimensional straight line and the estimated three-dimensional straight line. Finally, it fuses the matched pairs of three-dimensional point and three-dimensional straight-line features from two adjacent frames and, using the uncertainty information, computes the motion of the RGB-D camera by maximum likelihood estimation. The method fuses straight-line features that are insensitive to illumination change and builds a reasonable error model of the system, improving the robustness and accuracy of camera motion estimation.
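
To make two steps of this pipeline concrete, the sketch below shows pinhole back-projection and a point-to-line Mahalanobis distance. Everything here is an illustrative assumption rather than detail from the patent: the intrinsics fx, fy, cx, cy, the depth scale, and the use of the Euclidean foot point on the line as the comparison point are all placeholders.

```python
# Hedged sketch of back-projection and a point-to-line Mahalanobis distance;
# intrinsics, depth scale, and the foot-point simplification are assumptions.
import numpy as np

def back_project(u, v, depth_raw, fx, fy, cx, cy, depth_scale=1000.0):
    """Lift pixel (u, v) with raw depth to a 3D point in the camera frame."""
    z = depth_raw / depth_scale               # e.g. millimetres -> metres
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

def point_to_line_mahalanobis_sq(p, cov, a, d_hat):
    """Squared Mahalanobis distance from 3D point p (covariance cov) to the
    line through point a with unit direction d_hat, evaluated at the
    Euclidean foot point (a simplification of the patent's measure)."""
    foot = a + np.dot(p - a, d_hat) * d_hat   # orthogonal projection on line
    r = p - foot
    return float(r @ np.linalg.solve(cov, r))
```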

Description

Technical Field

[0001] The invention belongs to the technical field of machine vision, and in particular relates to a feature fusion-based RGB-D camera motion estimation method.

Background

[0002] In recent years, with the rapid development of image processing technology and the emergence of various vision sensors, vision-based mobile robots have received more and more attention. Compared with lidar, millimeter-wave radar, and similar sensors, visual sensors obtain richer environmental information at lower cost. Visual odometry (VO) estimates the motion of a camera, or of the body it is attached to (for example, a car, a person, or a mobile robot), using only visual sensors. It is a sub-problem of visual simultaneous localization and mapping (VSLAM) and a core problem in the autonomous navigation of mobile robots. Commonly used vision sensors include monocular cameras, binocular cameras, panoramic cameras and RGB-D cameras. RGB-D cameras, s...
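
As a simplified stand-in for the frame-to-frame motion estimation that visual odometry performs, the sketch below recovers a rigid transform from matched 3D points alone via the classical SVD (Kabsch/Umeyama) solution. It is unweighted and omits the line features and uncertainty-based maximum likelihood fusion of the patent's method, so it illustrates only the core computation; the function name is hypothetical.

```python
# Unweighted, points-only motion estimate (Kabsch/Umeyama); a simplified
# stand-in for the patent's maximum-likelihood point-and-line fusion.
import numpy as np

def estimate_rigid_motion(P, Q):
    """Find R, t minimizing ||Q - (R @ P + t)||; P, Q are 3xN matched points."""
    p_mean = P.mean(axis=1, keepdims=True)
    q_mean = Q.mean(axis=1, keepdims=True)
    H = (P - p_mean) @ (Q - q_mean).T         # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard reflections
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t
```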


Application Information

IPC(8): G06T7/73, G06T7/10, G06T7/246
CPC: G06T7/10, G06T7/246, G06T7/73
Inventors: 陈白帆, 刘春发, 宋德臻, 王斌
Owner: CENT SOUTH UNIV