
Virtual-reality occlusion handling method based on depth image data flow

A virtual-real occlusion processing method, applied in image data processing, 3D image processing, and image enhancement. It addresses problems such as time-consuming computation, the lack of 3D information about the real scene, and the difficulty of handling the occlusion relationship between virtual objects and the real scene, achieving improved accuracy and robustness.

Active Publication Date: 2017-10-24
QINGDAO RES INST OF BEIHANG UNIV +1


Problems solved by technology

[0004] Some of the above methods are time-consuming (for example, feature-point extraction and energy-equation optimization), lack three-dimensional information about the real scene, struggle to handle the occlusion relationship between virtual objects and the real scene, and produce virtual-real fusion that lacks realism.


Detailed Description of the Embodiments

[0023] Embodiments of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0024] As shown in Figure 1, the implementation of the present invention is divided into four main steps: depth data preprocessing, construction of a 3D point cloud model of the scene, 3D space registration, and virtual-real fusion rendering.

[0025] Step 1. Depth data preprocessing

[0026] Its main steps are:

[0027] (11) For the depth data in the given input RGBD (color + depth) data stream, set thresholds w_min and w_max according to the error range of the depth camera. Points whose depth value lies between w_min and w_max are regarded as credible, and only the depth data I within this threshold range is kept.
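The thresholding in step (11) amounts to a per-pixel range check. A minimal sketch in numpy, where the function name, the zeroing of rejected pixels, and the sample values are illustrative assumptions rather than the patent's actual implementation:

```python
import numpy as np

def threshold_depth(depth, w_min, w_max):
    """Keep only depth values inside the camera's trusted error range.

    Values outside [w_min, w_max] are treated as unreliable and set to 0
    (a common convention for 'no data' in depth maps).
    """
    credible = (depth >= w_min) & (depth <= w_max)
    return np.where(credible, depth, 0.0)

# Example: with a trusted range of 0.4 m to 4.0 m, the values 0.2 and 6.0
# fall outside the range and are zeroed out.
depth = np.array([[0.2, 1.5],
                  [3.8, 6.0]])
filtered = threshold_depth(depth, w_min=0.4, w_max=4.0)
```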

[0028] (12) Perform fast bilateral filtering on each pixel of the depth data, as follows:

[0029]

[0030] where p_j denotes a pixel in the neighborhood of pixel p_i, and s is the number of effective pixels i...
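The patent's equation is not reproduced in this extract, but a standard bilateral filter weights each neighbor p_j by both its spatial distance to p_i and the similarity of their depth values. A naive (not "fast") sketch under that assumption, with Gaussian kernel widths sigma_s and sigma_r chosen for illustration and invalid (zero) pixels skipped:

```python
import numpy as np

def bilateral_filter(depth, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Naive bilateral filter on a depth map.

    Each output pixel is a weighted average over its neighborhood; the
    weight of neighbor (ni, nj) combines spatial closeness (sigma_s) and
    depth similarity (sigma_r). Zero-valued pixels (no data) are ignored.
    """
    h, w = depth.shape
    out = np.zeros_like(depth)
    for i in range(h):
        for j in range(w):
            if depth[i, j] == 0:
                continue  # no data at this pixel
            acc, norm = 0.0, 0.0
            for di in range(-radius, radius + 1):
                for dj in range(-radius, radius + 1):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w and depth[ni, nj] > 0:
                        ws = np.exp(-(di * di + dj * dj) / (2 * sigma_s ** 2))
                        wr = np.exp(-((depth[ni, nj] - depth[i, j]) ** 2)
                                    / (2 * sigma_r ** 2))
                        acc += ws * wr * depth[ni, nj]
                        norm += ws * wr
            out[i, j] = acc / norm
    return out

# Sanity check: a constant depth map is unchanged by the filter.
flat = bilateral_filter(np.full((5, 5), 1.0))
```

Production systems would use a separable or grid-based "fast" approximation (or a library routine such as OpenCV's `bilateralFilter`) rather than this O(n·r²) double loop.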



Abstract

The invention relates to a virtual-real occlusion handling method based on a depth image data stream. The method comprises three parts: construction of a scene point cloud model, three-dimensional space registration, and virtual-real occlusion handling and rendering. First, filtering and other preprocessing operations are performed on the depth data acquired by a depth camera, and the normal vector of each point is calculated. The camera pose is then computed with an iterative closest point (ICP) algorithm, matching the point cloud carrying the normal vectors against the point cloud projected from the three-dimensional scene model using the previous frame's camera pose. The point cloud of the current frame is then fused into the three-dimensional scene point cloud model. Once the scene is reconstructed, feature points of the color image acquired by the depth camera are computed in real time and matched against feature points of a template image for three-dimensional space registration. Finally, the spatial position and occlusion relationships between the virtual object and the three-dimensional scene are processed using the obtained camera pose and rendered in real time. The method runs in real time on current mainstream hardware, and a good virtual-real occlusion effect is obtained even when the resolution of the input data is low.
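The final stage described in the abstract resolves occlusion per pixel: a virtual fragment is visible only where it is closer to the camera than the reconstructed real scene. A minimal compositing sketch of that idea, assuming the real and virtual depth maps are already in the same camera frame and using `inf` to mark pixels with no virtual content (both conventions are illustrative, not from the patent):

```python
import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth):
    """Per-pixel virtual-real occlusion handling.

    Show the virtual color only where the virtual surface is closer to
    the camera than the real scene; elsewhere keep the real image.
    Pixels where virt_depth == inf carry no virtual content and always
    fall back to the real image.
    """
    virtual_wins = virt_depth < real_depth
    mask = virtual_wins[..., None]  # broadcast the mask over RGB channels
    return np.where(mask, virt_rgb, real_rgb)

# Example: a 1x2 image. At pixel 0 the virtual object (0.5 m) is in front
# of the real scene (1.0 m); at pixel 1 there is no virtual content.
real_rgb = np.zeros((1, 2, 3))          # real image: black
real_depth = np.array([[1.0, 1.0]])
virt_rgb = np.ones((1, 2, 3))           # virtual render: white
virt_depth = np.array([[0.5, np.inf]])
out = composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth)
```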

Description

Technical Field

[0001] The invention belongs to the fields of computer vision, computer graphics, and image processing, and specifically relates to a virtual-real occlusion processing method based on a depth image data stream. Even when the resolution of the input data is low and the depth data contain holes and noise, the method can estimate the camera pose and reconstruct a point cloud model of the 3D scene in real time, process the occlusion relationship between virtual objects and the 3D scene in real time according to the camera pose, and fuse the occlusion-processing result with the color image to achieve a virtual-real fusion effect. This is of great significance for research on 3D reconstruction systems and real-time augmented reality (AR) technology.

Background Technique

[0002] Augmented reality is a technology that superimposes virtual objects onto the real environment to achieve a virtual-real fusion effect. In recent years, it has become a r...


Application Information

IPC(8): G06T19/00, G06T7/80, G06T5/00, G06T15/00, G06T5/50, G06K9/62
CPC: G06T5/50, G06T7/80, G06T15/005, G06T19/006, G06T2207/20221, G06T2207/20028, G06T2207/10028, G06T5/73, G06T5/70
Inventors: 齐越, 郭小志
Owner QINGDAO RES INST OF BEIHANG UNIV