
RGB-D image-based indoor scene three-dimensional reconstruction method

An RGB-D image and three-dimensional reconstruction technology, applied in the field of computer vision, solving the problems of holes in depth images, low reconstruction accuracy, and mis-segmentation, and achieving the effects of optimized camera pose, improved accuracy, and suppressed error accumulation.

Inactive Publication Date: 2019-04-19
HUAZHONG UNIV OF SCI & TECH
Cites: 8 · Cited by: 57

AI Technical Summary

Problems solved by technology

[0005] Aiming at the defects of the prior art, the purpose of the present invention is to solve the technical problems of the prior art that the reconstruction accuracy is not high and that the mis-segmentation caused by holes in the depth image cannot be resolved.



Examples


Embodiment Construction

[0041] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

[0042] First, some terms used in the present invention are explained.

[0043] RGB-D image: consists of a color image (RGB image) and a depth image. Usually the color image and depth image are registered so that their pixels correspond one to one.

[0044] Depth image (Depth Image, hereinafter referred to as the D image): an image or image channel that contains information about the distance from the viewpoint to the surfaces of scene objects. Each pixel value is the actual distance from the sensor to the object.

[0045] 3D point cloud: Project each pixel of the depth ...
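The back-projection from a depth image to a 3D point cloud can be sketched as follows. This is a minimal numpy illustration, not the patent's implementation; it assumes the standard pinhole model with intrinsics fx, fy, cx, cy, and treats zero-depth pixels as holes:

```python
import numpy as np

def depth_to_point_cloud(depth, K):
    """Back-project each valid depth pixel (u, v, Z) to a 3D point
    (X, Y, Z) in camera coordinates via the pinhole model:
        X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0                      # zero depth marks a hole
    z = depth[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    return np.stack([x, y, z], axis=-1)    # (N, 3) point cloud
```

Hole pixels are simply dropped here; in the method described by this patent, such holes are instead repaired with the help of semantic segmentation before reconstruction.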



Abstract

The invention discloses an RGB-D image-based indoor scene three-dimensional reconstruction method. A semantic segmentation result is used for repairing holes in the depth image and provides object contour and category information for three-dimensional reconstruction, and the shape and appearance of an object are obtained from prior knowledge, so that more accurate data are provided for three-dimensional reconstruction. Three-dimensional reconstruction in turn provides three-dimensional spatial information for semantic segmentation, solving the mis-segmentation caused by object overlap, illumination effects, and the like in two-dimensional image segmentation. Multi-level camera pose estimation first uses sparse feature matching to provide a rough pose estimate; a precise camera pose is then obtained through dense geometric and photometric optimization, providing a more accurate camera pose for model reconstruction. During reconstruction, each frame is locally optimized; meanwhile, a key-frame mechanism is added, global optimization and closed-loop detection are performed, and constraints are established for the spatial points corresponding to key-frame pixels, which effectively suppresses error accumulation, further optimizes the camera pose, and improves the accuracy of the reconstruction result.
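The dense photometric term in the pose refinement above can be illustrated with a minimal sketch. This is not the patent's implementation; it assumes a pinhole model and uses nearest-neighbour sampling for simplicity. Each frame-1 pixel is back-projected with its depth, transformed by a candidate pose, re-projected into frame 2, and the intensity difference forms the residual an optimizer would minimize:

```python
import numpy as np

def backproject(depth, K):
    """Back-project every depth pixel to a 3D point in camera coordinates."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - K[0, 2]) * z / K[0, 0]
    y = (v - K[1, 2]) * z / K[1, 1]
    return np.stack([x, y, z], axis=-1)            # (h, w, 3)

def photometric_residuals(I1, I2, depth1, K, T):
    """Photometric residuals of warping frame-1 pixels into frame 2.

    T is the 4x4 rigid transform taking frame-1 camera coordinates to
    frame-2 camera coordinates (nearest-neighbour intensity sampling)."""
    h, w = depth1.shape
    P1 = backproject(depth1, K).reshape(-1, 3)
    P2 = (T[:3, :3] @ P1.T).T + T[:3, 3]           # rigid transform
    z2 = P2[:, 2]
    u2 = np.round(K[0, 0] * P2[:, 0] / z2 + K[0, 2]).astype(int)
    v2 = np.round(K[1, 1] * P2[:, 1] / z2 + K[1, 2]).astype(int)
    valid = (z2 > 0) & (u2 >= 0) & (u2 < w) & (v2 >= 0) & (v2 < h)
    return I1.reshape(-1)[valid] - I2[v2[valid], u2[valid]]
```

With the identity pose and identical frames all residuals vanish; a pose optimizer perturbs T to drive these residuals (together with a geometric term) toward zero.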

Description

Technical field

[0001] The invention belongs to the technical field of computer vision, and more specifically relates to a method for three-dimensional reconstruction of indoor scenes based on RGB-D images.

Background technique

[0002] The depth camera Kinect works as follows: an infrared emitter projects infrared light onto object surfaces, forming random reflection speckles that are received by a depth sensor, from which the system chip computes a depth image. For transparent materials and texture-less planes, the infrared light is not reflected into usable speckle (or only poorly), resulting in a depth image with holes. At present, most research works simply preprocess the depth image with a bilateral filter.

[0003] In the prior art, 3D reconstruction based on RGB-D images mainly includes: Newcombe et al. directly calculate the 3D coordinates of the spatial points through the preprocessed depth image, and...
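The bilateral preprocessing mentioned in [0002] can be sketched as a naive numpy version, for illustration only. Note that hole pixels (depth 0) receive no value and contribute no weight, which hints at why simple filtering cannot repair larger holes, motivating the segmentation-guided repair of this invention:

```python
import numpy as np

def bilateral_filter_depth(depth, radius=2, sigma_s=2.0, sigma_r=0.05):
    """Naive bilateral filter for a depth image.

    Combines a spatial Gaussian (sigma_s, in pixels) with a range
    Gaussian (sigma_r, in depth units). Zero pixels are holes: they are
    left at 0 and excluded from every neighbourhood average."""
    h, w = depth.shape
    out = np.zeros_like(depth, dtype=float)
    for y in range(h):
        for x in range(w):
            if depth[y, x] == 0:
                continue                       # hole: leave unfilled
            acc, wsum = 0.0, 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if not (0 <= ny < h and 0 <= nx < w):
                        continue
                    if depth[ny, nx] == 0:
                        continue               # holes carry no weight
                    ws = np.exp(-(dx * dx + dy * dy) / (2 * sigma_s ** 2))
                    wr = np.exp(-((depth[ny, nx] - depth[y, x]) ** 2)
                                / (2 * sigma_r ** 2))
                    acc += ws * wr * depth[ny, nx]
                    wsum += ws * wr
            out[y, x] = acc / wsum
    return out
```

The range weight wr keeps depth discontinuities at object boundaries sharp while the spatial weight ws smooths sensor noise within surfaces.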

Claims


Application Information

IPC(8): G06T7/50; G06T7/70; G06T17/00; G06K9/62; G06K9/46; G06K9/34
CPC: G06T7/50; G06T7/70; G06T17/00; G06T2207/10028; G06V10/267; G06V10/44; G06V10/757
Inventors: 郭红星, 卢涛, 汤俊良, 熊豆, 孙伟平, 夏涛, 范晔斌
Owner HUAZHONG UNIV OF SCI & TECH