
Three-dimensional model reconstruction method and system

A 3D model reconstruction technology, applied in the fields of 3D modeling, image data processing, and instruments, achieving the effects of improved accuracy, reduced data redundancy, and reduced motion blur.

Inactive Publication Date: 2015-05-06
SHENZHEN ORBBEC CO LTD
8 Cites · 94 Cited by


Problems solved by technology

[0004] To address the defect of existing 3D model reconstruction technology that the target must be manually marked, the present invention provides a 3D model reconstruction method and system, so that an accurate 3D model of the target can be obtained without manually marking the target.

Detailed Description of Embodiments

[0050] The present invention addresses a defect of the existing 3D model reconstruction process: the user must manually mark key identification points to create a 3D model, which is inconvenient to operate and yields low accuracy. The reconstruction method provided by the invention enables images to be spliced and fused without requiring the user to manually select key recognition points in the images, thereby improving both the accuracy and the speed of obtaining the 3D model.
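The patent text does not spell out its registration algorithm at this point, but splicing frames without user-selected key points is commonly realized by automatic rigid registration such as ICP. The sketch below is a generic point-to-point ICP illustration, not the patent's claimed method; the function names, iteration count, and use of SciPy's cKDTree for nearest-neighbour search are my own assumptions.

```python
# A minimal point-to-point ICP sketch: aligning two depth-derived point clouds
# without any manually selected key points. Generic illustration only -- the
# patent does not state that it uses exactly this algorithm.
import numpy as np
from scipy.spatial import cKDTree


def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def icp(source, target, iterations=30):
    """Repeatedly match nearest neighbours and re-estimate the rigid transform."""
    src = source.copy()
    tree = cKDTree(target)
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        _, idx = tree.query(src)                 # closest target point per source point
        R, t = best_fit_transform(src, target[idx])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total


# Usage: register a rotated, shifted copy of a cloud back onto the original.
rng = np.random.default_rng(0)
target = rng.random((500, 3))
theta = np.deg2rad(10)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
source = target @ R_true.T + np.array([0.05, 0.0, 0.0])
R_est, t_est = icp(source, target)
```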

[0051] The invention will now be described in detail in conjunction with the accompanying drawings and specific embodiments.

[0052] Figure 1 is a flowchart of the three-dimensional model reconstruction method provided by a preferred embodiment of the present invention. In this embodiment, step S1 is performed first: at least one depth camera is used to acquire the target to be modeled continuously and from multiple angles, generating multiple depth maps. The depth camera used in t...
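As a rough illustration of what the depth maps produced in step S1 look like downstream, the sketch below back-projects a single depth map into a point cloud through an assumed pinhole camera model. The intrinsics (fx, fy, cx, cy), the depth scale, and the function name are placeholders for illustration, not values or APIs taken from the patent.

```python
import numpy as np


def depth_to_points(depth, fx, fy, cx, cy, depth_scale=0.001):
    """Back-project a depth map (H x W) into an N x 3 point cloud.

    Assumes a pinhole camera model; fx, fy, cx, cy are intrinsics and
    depth_scale converts raw units (e.g. millimetres) to metres.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))    # pixel grid
    z = depth.astype(np.float64) * depth_scale        # metric depth
    valid = z > 0                                     # drop missing readings
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=1)


# Usage with synthetic data and made-up intrinsics:
depth = np.random.randint(500, 1500, size=(480, 640)).astype(np.uint16)
points = depth_to_points(depth, fx=570.0, fy=570.0, cx=320.0, cy=240.0)
print(points.shape)   # (N, 3)
```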



Abstract

The invention relates to a three-dimensional model reconstruction method and system. The method comprises the following steps: S1, performing image acquisition on a target using at least one depth camera to acquire depth images of the target; S2, preprocessing the acquired depth images; S3, obtaining dense point cloud data from the depth images of the target to reconstruct a point cloud mesh of the target's depth information; S4, registering and combining the multiple reconstructed depth image frames to obtain a three-dimensional model. By implementing the method and system, an accurate three-dimensional model of the target can be acquired without manually marking the target.
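To make the S1–S4 flow of the abstract concrete, here is a minimal pipeline skeleton that chains the steps together. It reuses the depth_to_points and icp sketches shown above; the preprocess and fuse placeholders (simple zero-masking and voxel de-duplication) are my own stand-ins for whatever preprocessing and fusion the patent actually specifies.

```python
import numpy as np


def preprocess(depth):
    """S2 stand-in: keep valid readings, zero out invalid ones."""
    return np.where(depth > 0, depth, 0)


def fuse(model_points, new_points, voxel=0.005):
    """S4 stand-in fusion: concatenate and de-duplicate on a coarse voxel grid."""
    pts = np.vstack([model_points, new_points])
    keys = np.round(pts / voxel).astype(np.int64)
    _, first = np.unique(keys, axis=0, return_index=True)
    return pts[first]


def reconstruct(depth_frames, fx, fy, cx, cy):
    """Chain S1-S4: an iterable of depth maps in, a fused point-cloud model out.

    Reuses depth_to_points and icp from the sketches above.
    """
    model = None
    for depth in depth_frames:                                    # S1: frames from the camera
        pts = depth_to_points(preprocess(depth), fx, fy, cx, cy)  # S2 + S3
        if model is None:
            model = pts
        else:
            R, t = icp(pts, model)                                # S4: register new frame
            model = fuse(model, pts @ R.T + t)
    return model
```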

Description

Technical field

[0001] The present invention relates to computer vision technology, and more specifically, to a method and system for reconstructing a three-dimensional model based on a depth camera.

Background technique

[0002] Using computer technology to build models of real objects is of great significance in many fields. For example, reconstructing a 3D model of the human body makes it possible to reproduce the posture, actions, and shape characteristics of the human body in a computer, providing a basis for applications such as gesture recognition and modifying shape characteristics.

[0003] However, in the existing 3D model reconstruction process, the user must manually mark key identification points to create a 3D model, which is inconvenient to operate and has low precision.

Contents of the invention

[0004] To address the defect of existing 3D model reconstruction technology that the target must be manually marked, the present invention provides a 3D model reconstruction method and system, so that an accurate 3D model of the target can be obtained without manually marking the target.


Application Information

IPC(8): G06T17/00, G06T7/00
CPC: G06T7/00, G06T17/00
Inventors: 肖振中, 许宏淮, 刘龙, 黄源浩
Owner: SHENZHEN ORBBEC CO LTD