
Depth map and IMU-based high-dynamic scene three-dimensional reconstruction method and system

A 3D reconstruction technology for high-dynamic scenes, applied in the field of computer vision and 3D reconstruction, which addresses the problem that mobile devices cannot perform 3D reconstruction of high-dynamic scenes.

Status: Inactive
Publication Date: 2019-10-08
Applicant: INST OF AUTOMATION CHINESE ACAD OF SCI +1

AI Technical Summary

Problems solved by technology

[0005] In order to solve the above problem in the prior art, namely that mobile devices cannot perform 3D reconstruction of high dynamic scenes, the present invention provides a depth map and IMU-based method for 3D reconstruction of high dynamic scenes.




Embodiment Construction

[0069] The application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the related invention, not to limit it. It should also be noted that, for convenience of description, only the parts related to the invention are shown in the drawings.

[0070] It should be noted that, provided there is no conflict, the embodiments of the present application and the features of those embodiments may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings and embodiments.

[0071] The method of the present invention for three-dimensional reconstruction of a high dynamic scene based on a depth map and an IMU includes:

[0072] Step S10, acquiring the high dynamic scene depth image of the current frame as the first image; the RGB image corresponding to t...
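As an illustration of Step S10 (truncated in this extract), a minimal frame-acquisition sketch follows. It assumes a 16-bit depth image stored in millimetres alongside an RGB image, loaded with OpenCV; the function name load_frame, the depth scale factor of 1000, and the commented-out file names are hypothetical and are not specified by the patent.

import cv2
import numpy as np

def load_frame(depth_path, rgb_path, depth_scale=1000.0):
    # Depth is assumed to be a 16-bit image in millimetres; RGB is 8-bit BGR.
    depth_raw = cv2.imread(depth_path, cv2.IMREAD_UNCHANGED)
    rgb = cv2.imread(rgb_path, cv2.IMREAD_COLOR)
    depth_m = depth_raw.astype(np.float32) / depth_scale   # convert to metres
    depth_m[depth_raw == 0] = np.nan                        # mark missing depth
    return depth_m, rgb

# Example usage: pair the current depth frame ("first image") with its RGB frame.
# depth, color = load_frame("frame_000123_depth.png", "frame_000123_rgb.png")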


Abstract

The invention belongs to the field of computer vision and three-dimensional reconstruction, and particularly relates to a depth map and IMU-based high-dynamic scene three-dimensional reconstruction method and system, which aims to solve the problem that mobile devices cannot perform three-dimensional reconstruction of high-dynamic scenes. The method comprises the following steps: converting the acquired current-frame depth map; performing image background segmentation in combination with the rotation matrix obtained by integrating the IMU data; tracking the current camera pose based on the camera pose of the previous frame and the background segmentation result of the current frame; performing volume data fusion according to the current camera pose and the image; and finally performing three-dimensional rendering according to the volume data to obtain a three-dimensional model of the high-dynamic scene. The method can efficiently perform dynamic/static segmentation by means of the color, depth and IMU information, eliminate the dynamic voxels in the model, and achieve fast and robust three-dimensional reconstruction of scenes containing dynamic objects on a mobile device.
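To make the pipeline described in the abstract more concrete, the sketch below illustrates two of its stages with NumPy: integrating gyroscope data into a rotation matrix and using that rotation to separate static background from dynamic pixels in back-projected depth maps. All function names, the first-order integration scheme, the 5 cm motion threshold, and the synthetic test data are assumptions made for illustration; they are not taken from the patent, and the subsequent pose tracking, volume fusion, and rendering stages are omitted.

import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    # Back-project an H x W depth map (metres) into camera-space 3D points.
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)               # H x W x 3

def integrate_imu_rotation(gyro_samples, dt):
    # First-order integration of gyroscope rates (rad/s) into a rotation matrix.
    R = np.eye(3)
    for wx, wy, wz in gyro_samples:
        skew = np.array([[0.0, -wz, wy],
                         [wz, 0.0, -wx],
                         [-wy, wx, 0.0]])
        R = R @ (np.eye(3) + skew * dt)
    u_, _, vt = np.linalg.svd(R)                           # re-orthonormalise
    return u_ @ vt

def segment_background(points_prev, points_curr, R_imu, thresh=0.05):
    # A pixel is treated as static background if, after compensating the camera
    # rotation estimated from the IMU, its 3D position moved less than `thresh`
    # metres between frames.
    rotated_prev = points_prev @ R_imu.T
    motion = np.linalg.norm(points_curr - rotated_prev, axis=-1)
    return motion < thresh                                 # boolean H x W mask

if __name__ == "__main__":
    depth0 = np.full((120, 160), 2.0)                      # flat wall 2 m away
    depth1 = depth0.copy()
    depth1[40:60, 60:90] = 1.0                             # a moving object appears
    pts0 = depth_to_points(depth0, 200.0, 200.0, 80.0, 60.0)
    pts1 = depth_to_points(depth1, 200.0, 200.0, 80.0, 60.0)
    R = integrate_imu_rotation([(0.0, 0.0, 0.0)] * 10, dt=0.01)   # camera held still
    mask = segment_background(pts0, pts1, R)
    print("background fraction:", mask.mean())

In a full system of the kind the abstract describes, the background mask produced this way would feed the camera pose tracking (for example, alignment restricted to background pixels) and the volumetric fusion and rendering stages.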

Description

Technical field

[0001] The invention belongs to the field of computer vision and three-dimensional reconstruction, and in particular relates to a method and system for three-dimensional reconstruction of a high dynamic scene based on a depth map and an IMU.

Background technique

[0002] 3D scanning of indoor scenes is a key technology for robotics and augmented reality. In recent years, with the development of depth sensors such as Microsoft Kinect and Intel RealSense, 3D scanning technology has made great progress. The depth and color maps collected by these sensors can be conveniently used to generate dense 3D models of scanned objects. The popularity of mobile devices such as mobile phones and tablet computers gives 3D scanning on mobile devices great application prospects, and the emergence of devices such as Google Tango and the Occipital Structure Sensor has made dense 3D scanning on mobile devices feasible.

[0003] Among these 3D reconstruction methods,...


Application Information

IPC (8): G06T17/00, G06T19/20, G06T7/33, G06K9/62
CPC: G06T17/00, G06T19/20, G06T7/33, G06T2207/10028, G06T2219/2016, G06F18/23213
Inventors: 高伟, 刘养东, 胡占义, 王家彬, 梁宵月
Owner: INST OF AUTOMATION CHINESE ACAD OF SCI