
Key frame extraction method for RGBD 3D reconstruction

A key frame extraction method and three-dimensional reconstruction technology, applied in the field of key frame extraction for RGBD three-dimensional reconstruction, solving the problems of low key frame quality, such as noise and holes in depth images and motion blur in RGB images, which affect camera pose optimization and texture extraction, and achieving the effects of reducing motion blur, holes, and noise and improving accuracy.

Active Publication Date: 2017-06-20
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

[0004] The above key frame extraction methods usually suffer from low key frame quality, such as noise and holes in the depth image and motion blur in the RGB image, which degrades camera pose optimization and texture extraction.

Method used

Examples

Embodiment Construction

[0023] Embodiments of the present invention will be described in detail with reference to the accompanying drawings.

[0024] The implementation process of the present invention is mainly divided into four steps: RGBD data frame grouping, projection depth image calculation, projection RGB image calculation, and projection data fusion.
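
The four steps operate on registered RGBD frames, each consisting of an RGB image, a depth image, and a camera pose estimated by visual odometry. As a minimal sketch of how one input frame could be organized (the container name and field layout below are illustrative assumptions, not taken from the patent):

from dataclasses import dataclass
import numpy as np

@dataclass
class RGBDFrame:
    rgb: np.ndarray      # C_i: H x W x 3 color image
    depth: np.ndarray    # D_i: H x W depth image (0 marks a hole)
    pose: np.ndarray     # T_i: 4 x 4 camera pose estimated by visual odometry
    timestamp: float     # acquisition time, used for temporal grouping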

[0025] Step 1. Grouping of RGBD data frames

[0026] For a given registered RGBD data stream Input1~Inputn, several frames of RGB images with adjacent time stamps (take C1~Ck as an example), the corresponding depth images (take D1~Dk as an example), and the corresponding camera poses (take T1~Tk as an example) are collected into one group.
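
A minimal sketch of this grouping step follows; the helper name group_frames and a fixed group size k are assumptions for illustration, since the patent only requires that frames with adjacent time stamps are collected into one group.

def group_frames(frames, k):
    # Split the registered stream Input_1..Input_n into consecutive groups of k
    # temporally adjacent frames; each group holds C_1..C_k, D_1..D_k, T_1..T_k.
    # (The final group may contain fewer than k frames.)
    return [frames[i:i + k] for i in range(0, len(frames), k)]

# e.g. groups = group_frames(rgbd_stream, k=5)   # rgbd_stream: list of per-frame records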

[0027] Step 2. Projection depth image calculation

[0028] Its main steps are:

[0029] Step (2.1): According to the depth camera intrinsic parameters Kd, map each pixel in D1~Dk to three-dimensional space, specifically:

[0030] p = Kd * (u, v, d)^T    (1)

[0031] where p is the mapped three-dimensional ...
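
A sketch of this per-pixel mapping is given below. It uses the standard pinhole back-projection p = d * Kd^{-1} * (u, v, 1)^T, which is an assumed reading of equation (1); the function name and the example intrinsics are likewise illustrative.

import numpy as np

def backproject_depth(depth, K_d):
    # depth: H x W array of depth values d; K_d: 3 x 3 depth-camera intrinsics.
    # Returns an H x W x 3 array of 3D points; pixels with d = 0 map to the origin.
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))      # pixel coordinates
    pix = np.stack([u, v, np.ones_like(u)], axis=-1)    # homogeneous (u, v, 1)
    rays = pix @ np.linalg.inv(K_d).T                    # K_d^{-1} * (u, v, 1)^T per pixel
    return rays * depth[..., None]                       # scale each ray by its depth

# e.g. (hypothetical intrinsics):
# K_d = np.array([[525.0, 0.0, 319.5], [0.0, 525.0, 239.5], [0.0, 0.0, 1.0]])
# points = backproject_depth(depth_image, K_d)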

Abstract

The invention discloses a key frame extraction method for RGBD 3D reconstruction. First, for the RGBD data stream acquired by a camera and the camera poses estimated by visual odometry, temporally adjacent data frames are divided into groups. For each group, every depth image is projected onto the first frame's depth image according to the camera poses and camera parameters; the first frame's RGB image is projected onto each of the remaining frames, and the gray value of each projected RGB image is obtained by linear interpolation. The blur degree of each RGB frame is estimated and combined with the weight of the corresponding projected depth image to obtain the weight of the projected RGB image. Finally, the projected depth images and projected RGB images within each group are fused according to these weights to obtain the RGBD key frame. The method reduces the holes and noise in data acquired by a depth camera, yields clear depth and RGB images, and provides a reliable data source for other tasks in 3D reconstruction such as global camera pose optimization and texture extraction.
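
A minimal sketch of the blur-weighted fusion summarized above is shown below. The concrete blur measure (variance of a discrete Laplacian response) and the way it is combined with the projected-depth weight (simple multiplication) are assumptions; the abstract only states that a blur degree is estimated and combined with that weight before fusing each group into one RGBD key frame.

import numpy as np

def sharpness(gray):
    # Higher value = sharper image (less motion blur); gray: H x W float array.
    lap = (-4.0 * gray
           + np.roll(gray, 1, axis=0) + np.roll(gray, -1, axis=0)
           + np.roll(gray, 1, axis=1) + np.roll(gray, -1, axis=1))
    return lap.var()

def fuse_group(proj_depths, depth_weights, proj_grays):
    # proj_depths, proj_grays: lists of k projected H x W images (depth / gray values);
    # depth_weights: per-pixel weights of the projected depth images.
    gray_weights = [w * sharpness(g) for w, g in zip(depth_weights, proj_grays)]
    wd = np.sum(depth_weights, axis=0) + 1e-8               # avoid division by zero
    wg = np.sum(gray_weights, axis=0) + 1e-8
    key_depth = np.sum([w * d for w, d in zip(depth_weights, proj_depths)], axis=0) / wd
    key_gray = np.sum([w * g for w, g in zip(gray_weights, proj_grays)], axis=0) / wg
    return key_depth, key_gray                               # the fused RGBD key frame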

Description

Technical field

[0001] The invention belongs to the fields of computer vision, computer graphics, and image processing, and specifically relates to a method for extracting key frames from an RGBD data stream, providing a more reliable data source for camera pose estimation optimization and texture reconstruction in three-dimensional reconstruction based on RGBD data streams; studying 3D reconstruction technology based on RGBD data is therefore of great significance.

Background technique

[0002] With the popularity of depth sensors and the development of 3D reconstruction technology, research on 3D model reconstruction based on RGBD data has emerged in recent years. Compared with traditional 3D reconstruction based on RGB images alone, the depth image provides 3D information about the scene, which greatly improves the feasibility and accuracy of 3D reconstruction. Keyframe extraction plays an important role in camera pose estimation, camera reloc...

Claims

Application Information

IPC(8): G06T7/579; G06T17/00
CPC: G06T17/00; G06T2207/10024; G06T2207/10028; G06T2207/20068; G06T2207/20221
Inventor 齐越, 韩尹波, 王晨
Owner BEIHANG UNIV