
Scene recovery method and device based on low-quality RGB-D data

A low-quality RGB-D data technology, applied in the field of 3D modeling, that addresses the problems of extensive manual interaction and high accuracy requirements for acquisition equipment, and achieves the effect of improving recovery accuracy

Active Publication Date: 2016-04-06
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

[0005] To address the defects of the prior art, namely the need for a large amount of manual interaction and the high accuracy required of the acquisition equipment, the present invention provides a scene recovery method and device based on low-quality RGB-D data.




Embodiment Construction

[0042] The technical solution of the present invention will be further described in detail in conjunction with the accompanying drawings and embodiments.

[0043] Figure 1 is a schematic flow diagram of the scene recovery method based on low-quality RGB-D data in this embodiment. As shown in Figure 1, the method provided by this embodiment includes:

[0044] S1. Acquire RGB-D images, then over-segment and stitch them to obtain a scene point cloud.

[0045] For example, a group of RGB-D images of a target scene contains both RGB (color) image information and depth image information. Based on the depth information, the computer can over-segment the group of images and stitch them into a complete scene point cloud.
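As a minimal sketch of the geometry behind step S1, the snippet below back-projects a single depth frame into a 3D point cloud using a pinhole camera model. The intrinsics (fx, fy, cx, cy) are placeholder values typical of a Kinect-class sensor, not parameters from the patent; the subsequent over-segmentation and multi-frame registration are assumed to be handled by separate pipeline steps.

```python
# Back-project a depth image into a point cloud (assumed intrinsics, not the
# patent's parameters). Registration/over-segmentation happens downstream.
import numpy as np

def depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Back-project a depth image (in meters) into an N x 3 point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinate grid
    z = depth
    x = (u - cx) * z / fx                            # pinhole camera model
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                  # drop invalid depth samples

# Example: a synthetic 480x640 depth frame with every pixel at 1.5 m
cloud = depth_to_point_cloud(np.full((480, 640), 1.5))
print(cloud.shape)  # (307200, 3)
```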

[0046] S2. Match the scene point cloud with models in the model library according to the semantic relationship and the point cloud classifier.
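The following is a hedged sketch of the segment-to-model matching idea in step S2: each point-cloud segment is reduced to a simple global descriptor (bounding-box extents plus a height histogram) and assigned the label of the nearest library descriptor. The descriptor and the toy library below are illustrative stand-ins, not the patent's actual point cloud classifier or semantic relationships.

```python
# Illustrative segment-to-model matching: nearest neighbor in a hand-made
# descriptor space. The descriptor and library are assumptions for the sketch.
import numpy as np

def descriptor(points, bins=8):
    extents = points.max(axis=0) - points.min(axis=0)        # bounding-box size
    heights = points[:, 2] - points[:, 2].min()
    hist, _ = np.histogram(heights, bins=bins, density=True) # height distribution
    return np.concatenate([extents, hist])

def match_to_library(segment, library):
    """Return the label of the library model whose descriptor is nearest."""
    d = descriptor(segment)
    labels, dists = zip(*[(name, np.linalg.norm(d - desc))
                          for name, desc in library.items()])
    return labels[int(np.argmin(dists))]

# Toy library of precomputed descriptors (random stand-ins for real CAD models)
rng = np.random.default_rng(0)
library = {"chair": descriptor(rng.random((500, 3))),
           "table": descriptor(rng.random((500, 3)) * np.array([2.0, 1.0, 0.8]))}
print(match_to_library(rng.random((400, 3)), library))
```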



Abstract

The invention relates to a scene recovery method and device based on low-quality RGB-D data, and belongs to the technical field of 3D modeling. The method and device recover the main object models of a scene according to the semantic relation and a point cloud classifier, accurately extract the contours of small objects from the corresponding color images, and recover the small objects by means of a contour retrieval method. The accuracy of recovering a 3D model from a low-quality RGB-D image sequence is thereby greatly improved, and a virtual 3D scene model that is semantically accurate and visually vivid is recovered automatically without any manual intervention.
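For the small-object path described above, the snippet below is a minimal sketch of contour extraction from a color image followed by shape-based retrieval against template contours, using standard OpenCV calls. The Otsu thresholding, the area filter, and the idea of matching against template silhouettes are assumptions for illustration, not the patent's specific procedure.

```python
# Minimal contour-extraction / contour-retrieval sketch (OpenCV >= 4).
# Templates would be e.g. rendered silhouettes of candidate models (assumption).
import cv2
import numpy as np

def extract_contours(bgr_image, min_area=100):
    """Binarize the color image and return outer contours above an area threshold."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > min_area]

def retrieve_best_template(query_contour, template_contours):
    """Return the index of the template whose shape best matches the query."""
    scores = [cv2.matchShapes(query_contour, t, cv2.CONTOURS_MATCH_I1, 0.0)
              for t in template_contours]
    return int(np.argmin(scores))
```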

Description

Technical field

[0001] The invention relates to the technical field of three-dimensional modeling, in particular to a scene recovery method and device based on low-quality RGB-D data.

Background technique

[0002] With the growing number of 3D models on the Internet (such as Google3DWarehouse and other model libraries) and the development of model retrieval technology, even ordinary people without any professional background can build indoor 3D scene models by searching for and placing models. Especially since the popularization of consumer-grade depth cameras (such as the Microsoft Kinect), the cost of point cloud acquisition in indoor scenes has become lower and lower, and the demand for digitally constructing virtual indoor scenes has become increasingly strong.

[0003] The goal of traditional 3D reconstruction methods is to accurately restore the geometric structure of the object, while the method of matching the point cloud against virtual 3D mo...


Application Information

IPC(8): G06K 9/62; G06T 17/00
Inventor: 胡事民 (Hu Shimin), 陈康 (Chen Kang), 吴育昕 (Wu Yuxin)
Owner TSINGHUA UNIV