An Interactive Depth Map Texture Copy Defect Removal Method

An interactive depth-map technology, applied to image data processing, image enhancement, and image analysis, achieving the effects of reducing original noise with a method that is simple and easy to implement

Active Publication Date: 2021-05-25
WUHAN UNIV
Cites: 8 · Cited by: 0

AI Technical Summary

Problems solved by technology

Because texture materials, patterns, color edges, and target geometric shape edges all produce strong luminance responses, geometric features and texture edges can be effectively detected on RGB images, but reliably distinguishing between the two remains challenging.

Method used



Examples


Embodiment Construction

[0027] The present invention will be described in detail below in conjunction with the accompanying drawings.

[0028] Feature boundaries of objects in the scene are detected on the RGB image I_col, and the detected boundary features are interactively marked with 1 or 0 (1 denotes a pseudo boundary corresponding to a texture copy to be eliminated; 0 denotes a real geometric feature of the target surface). Based on these marks, a new filter is designed to filter the depth map I_dep: it performs edge-preserving filtering on the real target geometric features, and isotropic filtering on the pseudo geometric boundaries marked 1, gradually removing texture copy defects.
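The label-guided filtering described above can be sketched as follows. This is a minimal illustrative implementation, not the patent's actual filter operator: the bilateral form, the parameters `sigma_s`, `sigma_r`, and `radius`, and the function name `label_guided_filter` are all assumptions made for the example.

```python
import numpy as np

def label_guided_filter(depth, labels, sigma_s=3.0, sigma_r=0.05, radius=4):
    """One pass of label-guided filtering (illustrative sketch only).

    labels: 1 = pseudo boundary (texture copy)  -> isotropic Gaussian smoothing
            0 = real geometric edge             -> edge-preserving (bilateral)
    """
    h, w = depth.shape
    out = depth.copy()
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))   # spatial Gaussian kernel
    pad = np.pad(depth, radius, mode='edge')
    for y in range(h):
        for x in range(w):
            patch = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            if labels[y, x] == 1:
                # pseudo boundary: pure spatial averaging smooths the texture copy away
                wgt = spatial
            else:
                # real edge: bilateral range term keeps depth discontinuities sharp
                wgt = spatial * np.exp(-(patch - depth[y, x])**2 / (2 * sigma_r**2))
            out[y, x] = (wgt * patch).sum() / wgt.sum()
    return out
```

In practice such a pass would be iterated, matching the "gradually remove" wording above; one pass already attenuates a marked pseudo boundary while leaving an unmarked step edge essentially intact.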

[0029] Wherein:

[0030] 1. Marking texture boundary defects on the gradient map: use the intensity map corresponding to the color map to generate the target gradient information, and grow pixels in the gradient map to obtain the various bou...
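As a hedged sketch of this step, the code below computes a gradient-magnitude map from an intensity image and grows connected high-gradient pixels into labelled boundary fragments via flood fill. The threshold `grad_thresh`, the 4-connectivity choice, and the function name `grow_boundaries` are assumptions for illustration, not details taken from the patent.

```python
import numpy as np
from collections import deque

def grow_boundaries(intensity, grad_thresh=0.1):
    """Gradient map + region growing of high-gradient pixels (sketch).

    Returns (gradient magnitude map, integer label map of boundary fragments).
    """
    gy, gx = np.gradient(intensity.astype(float))
    grad = np.hypot(gx, gy)                      # gradient magnitude
    strong = grad > grad_thresh                  # candidate boundary pixels
    labels = np.zeros(intensity.shape, dtype=int)
    next_label = 0
    for sy, sx in zip(*np.nonzero(strong)):
        if labels[sy, sx]:
            continue                             # already part of a fragment
        next_label += 1
        queue = deque([(sy, sx)])
        labels[sy, sx] = next_label
        while queue:                             # BFS flood fill, 4-connected
            y, x = queue.popleft()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < strong.shape[0] and 0 <= nx < strong.shape[1]
                        and strong[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    queue.append((ny, nx))
    return grad, labels
```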



Abstract

The invention relates to an interactive method for removing texture copy defects from depth maps. A depth camera simultaneously captures a color image and a depth image of the target, and a correspondence between the two images is established. After adjusting the brightness of the color image, edges (geometric edges and texture edges) are detected and identified, with edge fragments connected by a minimum spanning tree; texture edges and target geometric edges are then classified and marked interactively. An edge-preserving filter operator for the depth map, based on spatial neighborhoods and the classification marks on the color-map boundaries, is constructed, and the edge marking information on the color image guides the depth-map filtering, removing texture copy defects from the depth map while retaining the true geometric boundaries of the target surface. The invention provides an effective method for the ToF-based Kinect v2 depth camera that preserves real geometric features while eliminating texture-copy (false) geometric boundaries; the interaction is simple, convenient, and easy to implement.
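The "connecting edge fragments with a minimum spanning tree" step mentioned in the abstract could, in a simple form, look like the sketch below: Kruskal's algorithm with union-find over fragment endpoints, weighted by Euclidean distance. The function name `mst_link` and the representation of fragments as single endpoints are illustrative assumptions, not the patent's formulation.

```python
import math

def mst_link(points):
    """Link edge-fragment endpoints with a minimum spanning tree (sketch).

    points: list of (x, y) endpoints. Returns the list of linked index pairs.
    """
    n = len(points)
    # All candidate links, sorted by Euclidean length (Kruskal's ordering).
    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i in range(n) for j in range(i + 1, n))
    parent = list(range(n))
    def find(a):                       # union-find with path halving
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    links = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                   # keep the edge only if it joins two trees
            parent[ri] = rj
            links.append((i, j))
    return links
```

The MST keeps exactly the n−1 shortest links that connect all fragments without cycles, which is one plausible way to join broken edge pieces into continuous boundaries before interactive marking.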

Description

Technical field

[0001] The invention belongs to the field of computer graphics. It aims to improve the quality of scene depth maps perceived by consumer-level depth cameras, and in particular relates to an interactive method for eliminating texture copy defects (that is, false object boundaries) in depth maps.

Background technique

[0002] Depth perception of the target scene is the basis of optical 3D scanning, and it also helps to segment and identify different targets in the scene. Consumer-grade depth cameras (such as the Kinect v2, based on ToF distance measurement) can obtain the depth map of a scene in a low-cost and convenient way. However, the depth map corresponds to 3D surface segments with obvious texture copy defects (caused by color, pattern, and material edges). If the captured depth map is used directly for 3D reconstruction, false geometric features will appear on the fused surface, which reduces the reconstruction quality of the 3D scanned o...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G06T5/00, G06T7/13
CPC: G06T5/00, G06T5/002, G06T2207/20028, G06T7/13
Inventor: 肖春霞, 杨龙
Owner: WUHAN UNIV