
Camera pose estimation method oriented to RGBD (Red, Green and Blue-Depth) data stream

A pose-estimation technology for RGBD data streams, applied in image data processing, computing, instruments, etc. It addresses the problem that existing camera pose estimation methods are too time-consuming to meet real-time requirements, with the effect of reducing the amount of calculation and improving accuracy.

Active Publication Date: 2017-05-31
BEIHANG UNIV
Cites: 33

AI Technical Summary

Problems solved by technology

[0004] Some of the above camera pose estimation methods are time-consuming in steps such as feature-point extraction and energy-equation optimization, while others place high demands on input-data quality, such as consistent illumination and high resolution. As a result, they usually cannot meet real-time requirements on platforms with low computing power, such as mobile devices.




Embodiment Construction

[0022] Embodiments of the present invention will be described in detail with reference to the accompanying drawings.

[0023] As shown in Figure 4, the implementation of the present invention is divided into four main steps: depth-data preprocessing, construction of a three-dimensional point-cloud map, corresponding-point matching, and distance-error optimization.

[0024] Step 1. Depth data preprocessing

[0025] Its main steps are:

[0026] (1) For the depth data in the given input RGBD (color + depth) data stream, set thresholds w_min and w_max according to the error range of the depth camera. Points whose depth values lie between w_min and w_max are regarded as credible, and only the depth data I within this range is kept.
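The thresholding in step (1) can be sketched as follows. The concrete bounds (0.4 m / 4.0 m) are illustrative defaults for a typical consumer depth camera, not values taken from the patent:

```python
import numpy as np

def threshold_depth(depth, w_min=0.4, w_max=4.0):
    """Keep only depth readings inside the sensor's trusted range
    [w_min, w_max]; everything outside is marked invalid (0.0).
    The default bounds are illustrative, not from the patent."""
    out = depth.copy()
    out[(depth < w_min) | (depth > w_max)] = 0.0
    return out

d = np.array([[0.1, 1.5],
              [3.0, 9.9]])
print(threshold_depth(d))  # the 0.1 and 9.9 readings become 0.0
```

Marking rejected pixels as 0 (rather than deleting them) keeps the image grid intact for the filtering and back-projection steps that follow.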

[0027] (2) Perform fast bilateral filtering on each pixel of the depth data, as follows:

[0028]

[0029] where p_j is a pixel in the neighborhood of pixel p_i, and s is the number of valid pixels in t...
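Since the filter equation itself did not survive extraction, the sketch below shows a plain (not "fast") bilateral filter on a depth map as a stand-in for the patent's variant: each pixel is replaced by a spatial- and range-weighted average of its valid neighbors, which smooths noise while preserving depth discontinuities. The sigma values and window radius are illustrative assumptions:

```python
import numpy as np

def bilateral_filter_depth(depth, radius=2, sigma_s=2.0, sigma_r=0.05):
    """Naive bilateral filter on a depth map. Pixels with depth 0 are
    treated as invalid and skipped, both as centers and as neighbors.
    Parameter values are illustrative, not from the patent."""
    h, w = depth.shape
    out = np.zeros_like(depth)
    for i in range(h):
        for j in range(w):
            if depth[i, j] == 0:
                continue  # invalid center pixel stays invalid
            acc, norm = 0.0, 0.0
            for di in range(-radius, radius + 1):
                for dj in range(-radius, radius + 1):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w and depth[ni, nj] > 0:
                        # spatial weight: penalize distant neighbors
                        ws = np.exp(-(di * di + dj * dj) / (2 * sigma_s ** 2))
                        # range weight: penalize large depth differences
                        wr = np.exp(-((depth[ni, nj] - depth[i, j]) ** 2)
                                    / (2 * sigma_r ** 2))
                        acc += ws * wr * depth[ni, nj]
                        norm += ws * wr
            out[i, j] = acc / norm
    return out
```

The range weight is what distinguishes this from a Gaussian blur: neighbors across a depth edge contribute almost nothing, so object boundaries stay sharp.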



Abstract

The invention relates to a camera pose estimation method oriented to an RGBD (Red, Green and Blue-Depth) data stream. The method comprises the construction and registration of three-dimensional point-cloud maps. First, preprocessing operations including filtering are carried out on the depth data; the preprocessed depth image is then converted into a three-dimensional point-cloud map according to the camera's intrinsic parameters. Using the camera pose of the previous frame, a projection-mapping algorithm matches corresponding points between the two frames' point-cloud maps, a point-to-plane distance-error function is minimized by energy optimization, and the six parameters of the transformation matrix between the two frames' point clouds are solved. The camera pose of the current frame is then estimated from that of the previous frame, and the above optimization is iterated to obtain the final camera pose. The method runs in real time on existing mobile devices and produces good results even when the input data has low resolution.
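Two operations at the core of the abstract can be sketched as follows: back-projecting a depth image into a 3D point cloud via the camera intrinsics, and evaluating the point-to-plane residual that the pose optimization minimizes. Variable names and the intrinsic matrix are illustrative assumptions, not the patent's exact formulation:

```python
import numpy as np

def backproject(depth, K):
    """Depth image -> 3D points in the camera frame, one point per
    pixel, using the pinhole intrinsics K (fx, fy, cx, cy)."""
    h, w = depth.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)  # shape (h, w, 3)

def point_to_plane_residual(src, dst, normals, T):
    """Point-to-plane error for matched point pairs:
    r_i = n_i . (R @ src_i + t - dst_i), with T = [R | t].
    Pose optimization seeks the T minimizing sum(r_i ** 2)."""
    src_t = src @ T[:3, :3].T + T[:3, 3]
    return np.sum((src_t - dst) * normals, axis=1)
```

With the identity transform and identical point sets the residuals are exactly zero; the six pose parameters (three for rotation, three for translation) are the unknowns of the linearized least-squares step that the abstract's iteration solves.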

Description

technical field [0001] The invention belongs to the fields of computer vision and computer graphics image processing, and in particular relates to a real-time camera pose estimation method for RGBD data streams. The method can estimate the camera pose in real time when the input data has low resolution, the depth data contains holes and noise, and the device has low computing capability, and is of great significance for research on SLAM and real-time 3D reconstruction technology. Background technique [0002] With the popularity of depth sensors and the development of 3D reconstruction technology, research on 3D model reconstruction based on depth data has been emerging in recent years. Compared with traditional 3D reconstruction based on RGB data, depth data provides 3D information of the scene, which greatly improves the feasibility and accuracy of 3D reconstruction. Camera pose estimation plays a vital role in related applications such as 3D reconstruction. [0003] At prese...

Claims


Application Information

IPC(8): G06T7/33, G06T7/70
CPC: G06T2207/10028, G06T2207/30244
Inventors: 齐越, 韩尹波, 王晨, 李宏毅
Owner BEIHANG UNIV