
TOF camera-based hair segmentation method

A hair segmentation technology based on a TOF (time-of-flight) camera, applied in the field of three-dimensional images, achieving high-quality, high-precision segmentation

Active Publication Date: 2019-08-02
PLEX VR DIGITAL TECH CO LTD

AI Technical Summary

Problems solved by technology

In our case, the depth edges obtained when shooting only the head are very sharp, but the high-and-low depth fluctuations in the hair region are very obvious.




Embodiment Construction

[0020] The present invention will now be further described in conjunction with the accompanying drawings.

[0021] Referring to Figures 1 and 2, an embodiment of the present invention is shown. This embodiment achieves higher-precision segmentation of human hair based on a TOF camera, using in reverse the characteristic that time-of-flight measurement generates a certain degree of noise on hair in order to segment the hair with higher precision.

[0022] First, deep learning is used to obtain a preliminary segmentation of the hair.

[0023] That is, an existing small and relatively coarse dataset of hair color images plus masks is used to construct a deep learning network for hair, and the important information of the hair gradient is integrated to train the network and obtain a preliminary hair segmentation map.
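The patent does not specify how the hair gradient information is integrated into the network; a common approach, sketched below under that assumption, is to compute a per-pixel gradient-magnitude map and stack it with the RGB channels as an extra input channel (`gradient_channel` is a hypothetical helper name):

```python
import numpy as np

def gradient_channel(gray):
    """Per-pixel gradient magnitude, normalized to [0, 1].

    Hair regions tend to show dense fine-scale gradients, so this map
    can serve as an additional input channel for the segmentation
    network. (The patent's actual fusion scheme is not disclosed;
    channel-stacking is an assumption.)
    """
    gy, gx = np.gradient(gray.astype(np.float64))
    mag = np.hypot(gx, gy)
    peak = mag.max()
    return mag / peak if peak > 0 else mag

# Stack the gradient map onto an RGB image to form a 4-channel input
# (an assumed network-input layout, for illustration only).
rgb = np.random.rand(64, 64, 3)
gray = rgb.mean(axis=2)
x = np.dstack([rgb, gradient_channel(gray)])  # shape (64, 64, 4)
```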

[0024] Referring to Figure 1, this embodiment uses the Figaro 1K hair dataset, which contains 1050 hair samples. Data augmentation is performed on the dataset: noise is added to the hair, change br...
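The source text listing the augmentations is truncated, so the sketch below is illustrative only: it applies additive Gaussian noise plus a brightness jitter (the noise level and jitter range are assumptions, not values from the patent) to expand the roughly 1050-image dataset:

```python
import numpy as np

def augment(img, rng):
    """One random augmentation pass over an image in [0, 1].

    Adds sensor-like Gaussian noise and a multiplicative brightness
    shift, then clips back to the valid range. Parameters are
    illustrative assumptions.
    """
    noisy = img + rng.normal(0.0, 0.02, img.shape)   # additive noise
    bright = noisy * rng.uniform(0.8, 1.2)           # brightness jitter
    return np.clip(bright, 0.0, 1.0)

# Generate several augmented variants per source image.
rng = np.random.default_rng(0)
img = rng.random((32, 32, 3))
variants = [augment(img, rng) for _ in range(4)]
```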



Abstract

The invention relates to a TOF camera-based hair segmentation method comprising the following steps: establishing a deep learning network based on a dataset of hair color images and their masks; obtaining a depth map with a TOF camera and deriving a variance map from it; optimizing the initial hair mask obtained by deep learning to obtain an optimized hair mask; and optimizing the optimized hair mask again using the variance map to obtain the final, accurate hair mask. The method ingeniously uses in reverse the characteristic that time-of-flight measurement generates a certain degree of noise on hair, achieving higher-precision segmentation of the hair.
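The abstract's final step, refining the mask with the depth-variance map, can be sketched as follows. The window size, variance threshold, and AND-combination of mask and variance map are all assumptions; the patent only states that the variance map is used to optimize the mask again:

```python
import numpy as np

def local_variance(depth, k=3):
    """Per-pixel variance of depth in a k x k window (border left zero).

    TOF depth is noisy on hair, so high local variance marks pixels
    that are likely hair. The window size is an assumption.
    """
    h, w = depth.shape
    out = np.zeros((h, w), dtype=np.float64)
    r = k // 2
    for i in range(r, h - r):
        for j in range(r, w - r):
            out[i, j] = depth[i - r:i + r + 1, j - r:j + r + 1].var()
    return out

def refine_mask(init_mask, depth, var_thresh):
    """Keep only initial-mask pixels whose local depth variance is high.

    The threshold and the AND-combination are illustrative assumptions.
    """
    return init_mask & (local_variance(depth) > var_thresh)
```

For example, with a depth map that is noisy on the left half and flat on the right, the refined mask retains pixels only where the depth fluctuates, mimicking how TOF noise isolates hair from skin and background.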

Description

Technical field

[0001] The invention relates to the field of three-dimensional images, and in particular to a TOF camera-based hair segmentation method that uses noise in reverse.

Background technique

[0002] At present, hair segmentation methods are not very robust, mainly because segmentation based on RGB images is strongly affected by illumination and background, and traditional image-based segmentation methods cannot accomplish this task well, especially on high-definition images. Hair segmentation has long been a very difficult problem.

[0003] With the advent of the era of artificial intelligence, people began to try to segment hair using deep learning methods. But a very obvious problem is that there are very few datasets for hair. The main reason is that different people's hair varies enormously and its boundary is very rough, which creates great difficulty in data labeling. Under the premise that dataset labeling is very difficult a...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/136, G06T7/187
CPC: G06T7/136, G06T7/187, G06T2207/20081, Y02T10/40
Inventors: 马原曦, 蒋琪雷, 李思远, 张迎梁
Owner: PLEX VR DIGITAL TECH CO LTD