
A facial expression editing method based on a generative adversarial network

A facial expression editing and generation technology, applied in the field of computer vision, which can solve problems such as the difficulty of modifying face texture, and achieves the effect of maintaining face identity information

Active Publication Date: 2021-04-06
SEETATECH BEIJING TECH CO LTD

AI Technical Summary

Problems solved by technology

Therefore, facial expression editing methods based on 3D face models have difficulty modifying the texture of the face.

Method used




Embodiment Construction

[0023]The present invention will be described in further detail below with reference to the accompanying drawings and specific embodiments.

[0024]A facial expression editing method based on a generative adversarial network; its overall steps are:

[0025]Step S1, the data preparation phase

[0026]a. Manually annotate each picture in the collected RGB image set, labeling the face identity information and the facial expression information; the annotation of each picture is expressed as [i, j], where i indicates that the picture belongs to the i-th person (0 ≤ i ...

[0027]b. Use a face detector and a facial feature point detector to crop the face from each collected picture and perform face alignment;
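A minimal sketch of this data-preparation step follows. It is an illustration, not the patent's implementation: it assumes OpenCV's bundled Haar cascade as the face detector, stands in for the landmark-based alignment with a plain crop-and-resize, and stores the annotation pair [i, j] (identity index, expression index) alongside each crop.

```python
# Data-preparation sketch (assumption: OpenCV's Haar cascade as the face
# detector; the patent's landmark-based alignment is approximated here by a
# simple crop-and-resize).
import os
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def prepare_sample(image_path, identity_id, expression_id, out_dir, size=128):
    """Detect the largest face, crop and resize it, save it, and return
    the crop together with its [i, j] annotation, or None on failure."""
    img = cv2.imread(image_path)
    if img is None:
        return None
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep the largest face
    crop = cv2.resize(img[y:y + h, x:x + w], (size, size))
    os.makedirs(out_dir, exist_ok=True)
    name = f"id{identity_id:04d}_exp{expression_id:02d}.png"
    cv2.imwrite(os.path.join(out_dir, name), crop)
    return crop, [identity_id, expression_id]
```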

[0028]Step S2, the model design phase

[0029]a. The model consists of two parts, namely the generator G and the discriminator D, where the generator G is used to generate ...

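Although paragraph [0029] is cut off above, the stated two-part structure (generator G, discriminator D) can be sketched as follows. This is a hedged PyTorch illustration, not the patent's architecture: the layer counts, channel widths, and the assumed 7-dimensional expression control vector are inventions of this sketch, and the discriminator is reduced to a single real/fake score even though the abstract says it also learns expression and identity distributions.

```python
# Minimal sketch of the two-part model (generator G, discriminator D) in
# PyTorch. Layer sizes and the 7-dim expression control vector are
# assumptions, not the patent's specification.
import torch
import torch.nn as nn

EXP_DIM = 7  # assumed length of the expression control vector

class Generator(nn.Module):
    """G: takes a face image plus an expression control vector and
    outputs an edited face image of the same size."""
    def __init__(self, img_channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(img_channels + EXP_DIM, 64, 4, 2, 1), nn.ReLU(True),
            nn.Conv2d(64, 128, 4, 2, 1), nn.ReLU(True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.ReLU(True),
            nn.ConvTranspose2d(64, img_channels, 4, 2, 1), nn.Tanh(),
        )

    def forward(self, img, exp_vec):
        # Broadcast the expression vector over the spatial grid and
        # concatenate it with the image along the channel axis.
        b, _, h, w = img.shape
        exp_map = exp_vec.view(b, EXP_DIM, 1, 1).expand(b, EXP_DIM, h, w)
        return self.net(torch.cat([img, exp_map], dim=1))

class Discriminator(nn.Module):
    """D: judges whether a face image is real or generated
    (a single real/fake logit per image in this sketch)."""
    def __init__(self, img_channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(img_channels, 64, 4, 2, 1), nn.LeakyReLU(0.2, True),
            nn.Conv2d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2, True),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(128, 1),
        )

    def forward(self, img):
        return self.net(img)
```

Concatenating the expression control vector with the input image channels is one common way to condition a generator; the patent does not specify how the conditioning is done.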


Abstract

The invention discloses a method for editing human facial expressions based on a generative adversarial network. The overall steps are as follows: in the data preparation stage, face images are manually annotated and cropped; in the model design stage, the model is composed of a generator and a discriminator; in the model training stage, real labeled face pictures and pictures produced by the generator are input into the discriminator, which is trained to distinguish the distribution of real samples from that of generated samples and to learn the distributions of facial expression and face identity information; the face picture to be edited and the expression control vector are then input into the generator, which outputs a face picture controlled by the expression control vector; adversarial training is then performed against the trained discriminator; the above steps are repeated to complete the construction of the model, and the built model is then tested. The invention ensures that the generator produces face pictures closer to the real face picture distribution, better preserves face identity information, and achieves more effective expression editing.
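The alternating training described in this abstract (discriminator on real vs. generated samples, then the generator against the trained discriminator, repeated) can be pictured with a short loop. This is a minimal sketch reusing the Generator/Discriminator classes sketched earlier; the loss choice, optimizer settings, and the omission of the expression- and identity-supervision terms are assumptions made for brevity, not the patent's exact formulation.

```python
# Sketch of one alternating training step, reusing the Generator /
# Discriminator sketches above. Binary cross-entropy and the Adam settings
# below are assumptions, not the patent's specification.
import torch
import torch.nn.functional as F

def train_step(G, D, opt_G, opt_D, real_imgs, exp_vecs):
    """First update D to separate real from generated faces, then update G
    so that its outputs are accepted as real by the trained D."""
    # --- Discriminator update: real samples vs. generated samples ---
    with torch.no_grad():
        fake_imgs = G(real_imgs, exp_vecs)
    d_real = D(real_imgs)
    d_fake = D(fake_imgs)
    loss_D = (F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real))
              + F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake)))
    opt_D.zero_grad()
    loss_D.backward()
    opt_D.step()

    # --- Generator update: fool the just-updated discriminator ---
    fake_imgs = G(real_imgs, exp_vecs)
    d_fake = D(fake_imgs)
    loss_G = F.binary_cross_entropy_with_logits(d_fake, torch.ones_like(d_fake))
    opt_G.zero_grad()
    loss_G.backward()
    opt_G.step()
    return loss_D.item(), loss_G.item()

# Example wiring (hyper-parameters are assumptions):
# G, D = Generator(), Discriminator()
# opt_G = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
# opt_D = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
```

At test time the trained generator is simply called as G(face_to_edit, expression_control_vector) to obtain the edited face picture.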

Description

Technical field
[0001]The present invention relates to an editing method, and more particularly to a facial expression editing method based on a generative adversarial network, belonging to the field of computer vision technology.
Background technique
[0002]Facial expression editing has wide applications in face animation, social software, and the augmentation of face recognition data sets. Current facial expression editing methods are mainly based on three-dimensional deformable face models. A representative method is disclosed in patent No. 201310451508.9, a facial expression editing method based on a single camera and motion capture data. Its main technical means are: a three-dimensional face model of the user is generated from a photo, and the three-dimensional face model is decoupled to separate identity and expression; the new three-dimensional face model is then controlled by controlling the facial expression com...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T11/00; G06N3/04
CPC: G06T11/001; G06N3/045
Inventors: 张刚, 韩琥, 张杰, 山世光, 陈熙霖
Owner: SEETATECH BEIJING TECH CO LTD