Virtual face modeling method based on real face image

A virtual face modeling technology based on real face images, which addresses the problems of large differences between the modeled face and the real face and of long modeling time, and achieves the effect of low cost and short modeling time.

Active Publication Date: 2019-08-30
CHONGQING UNIV

AI Technical Summary

Problems solved by technology

[0005] In view of this, the purpose of the present invention is to provide a virtual face modeling method based on real face images, so as to solve the technical problems that arise when 3D software is used to build the face of a virtual character head model in the prior art, namely that the built virtual face differs greatly from the real face and that the modeling time is long.



Examples


Embodiment Construction

[0032] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0033] The virtual face modeling method based on a real face image of this embodiment comprises the following steps:

[0034] 1) Making the head model, which in turn comprises the following steps:

[0035] a) Collect frontal photographs of the real person's face, and use Insight 3D to convert them into a 3D head model;

[0036] b) In Insight 3D, separate the face part of the 3D head model from the rest of the model, and smooth the separated face part into an ellipsoid surface. The specific operation in Insight 3D is as follows: first select the head model and convert it into an editable polygon, then select the face part of the head model and choose the separate option under the edit geometry options, so that the face can easily be separated from the rest of th...
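
The embodiment performs this separation manually through Insight 3D's editing tools. Purely as an illustration of the same idea, the following Unity C# sketch splits a head mesh into a face sub-mesh by keeping only the triangles whose vertices fall inside an assumed face bounding box; the class name FaceRegionSeparator, the bounds values, and the bounding-box criterion are hypothetical and are not taken from the patent.

    // Hypothetical sketch: programmatically extracting the "face" region of a
    // head mesh, analogous to the manual "editable polygon -> separate" step
    // the embodiment performs in Insight 3D. Illustrative assumptions only.
    using System.Collections.Generic;
    using UnityEngine;

    public class FaceRegionSeparator : MonoBehaviour
    {
        // Local-space bounds assumed to enclose the facial area of the head mesh.
        public Bounds faceBounds = new Bounds(new Vector3(0f, 0f, 0.08f),
                                              new Vector3(0.14f, 0.2f, 0.1f));

        public Mesh ExtractFaceSubMesh(Mesh headMesh)
        {
            Vector3[] vertices = headMesh.vertices;
            int[] triangles = headMesh.triangles;
            var faceTriangles = new List<int>();

            // Keep a triangle when all three of its vertices fall inside the
            // assumed face bounds; everything else stays with the head.
            for (int i = 0; i < triangles.Length; i += 3)
            {
                if (faceBounds.Contains(vertices[triangles[i]]) &&
                    faceBounds.Contains(vertices[triangles[i + 1]]) &&
                    faceBounds.Contains(vertices[triangles[i + 2]]))
                {
                    faceTriangles.Add(triangles[i]);
                    faceTriangles.Add(triangles[i + 1]);
                    faceTriangles.Add(triangles[i + 2]);
                }
            }

            // Build a new mesh that reuses the original vertex buffer but only
            // references the face triangles; unreferenced vertices are ignored.
            var faceMesh = new Mesh();
            faceMesh.vertices = vertices;
            faceMesh.uv = headMesh.uv;
            faceMesh.triangles = faceTriangles.ToArray();
            faceMesh.RecalculateNormals();
            return faceMesh;
        }
    }

Placing the extracted face mesh on its own object would then allow the expression maps described later to be applied to the face alone, while the remainder of the head keeps its original geometry.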



Abstract

The invention discloses a virtual face modeling method based on a real face image. The method comprises the following steps: 1) making a head model; 2) making a set of facial expression maps; 3) in Unity 3D, using the frame animation machine built into Unity 3D to turn each obtained expression map set into a corresponding expression animation; 4) making an expression animation controller; 5) adding hair and accessories to the 3D head model to dress the face; 6) writing a script to control the displacement of the parts of the 3D head model other than the face, so that the 3D head model moves correspondingly with the face displacement. In this method, the face of the established 3D virtual head model is covered with facial expression images of the real person, so that the face of the established virtual character head model closely resembles the real person's face; the modeling process does not require the complex spatial design of existing 3D software modeling, so the modeling time is shorter and the cost is lower.
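
Step 6 of the abstract mentions writing a script so that the rest of the 3D head model moves together with the face. As a minimal sketch of what such a Unity C# script could look like (the class name HeadFollowsFace, the field names, and the offset-following strategy are assumptions, not the patent's actual script):

    using UnityEngine;

    // Hypothetical sketch of step 6: keep the rest of the head model moving
    // with the separated face object as the face is displaced or rotated.
    public class HeadFollowsFace : MonoBehaviour
    {
        public Transform face;            // transform of the separated face object
        private Vector3 localOffset;      // head position expressed in the face's frame
        private Quaternion rotationOffset;

        void Start()
        {
            // Record how the head sits relative to the face at startup so the
            // same spatial relation can be preserved while following.
            localOffset = Quaternion.Inverse(face.rotation) * (transform.position - face.position);
            rotationOffset = Quaternion.Inverse(face.rotation) * transform.rotation;
        }

        void LateUpdate()
        {
            // Reapply the recorded relation every frame: when the face moves,
            // the head follows with the same offset and relative orientation.
            transform.position = face.position + face.rotation * localOffset;
            transform.rotation = face.rotation * rotationOffset;
        }
    }

In practice such a script would be attached to the head object (excluding the face) and the face Transform assigned in the Inspector.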

Description

Technical Field

[0001] The invention relates to the technical field of three-dimensional modeling, and in particular to a virtual face modeling method based on real face images.

Background Art

[0002] With the continuous development of science and technology, people pursue an ever higher quality of life, and the emergence of communication robots has attracted wide attention. The rise of VR technology makes the experience more immersive and technological: after putting on VR equipment, the user can interact with virtual robots in a computer-generated simulation environment. The market for virtual robots has begun to expand into hosting and reception services, and it is expected that in the near future virtual robots will be widely used in various service industries.

[0003] Unity 3D is the most widely used platform for supporting virtual interaction. In order to make the interaction more realistic, the human-like performance of the character model imported into Unity...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/00; G06K9/00
CPC: G06T17/00; G06V40/176; G06V40/174
Inventors: 宋永端, 沈志熙, 刘鹏, 曾海林
Owner: CHONGQING UNIV