
Three-dimensional head modeling method based on two images

A three-dimensional head modeling technology, applied in the field of three-dimensional head shape and texture modeling based on two front and side images. It addresses the problem that existing three-dimensional head modeling techniques are immature, reduces the amount of manual interaction required, and has the effect of being simple, practical, and broadly applicable.

Active Publication Date: 2012-04-25
INST OF AUTOMATION CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0005] To sum up, existing 3D head modeling technology is far from mature: it cannot recover a 3D face model and its details from a small number of given images with little or no manual interaction. How to fully reconstruct the shape and details of an arbitrary user's head from no more than two images, without prior training samples, remains a challenging problem.



Embodiment Construction

[0023] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with specific embodiments and with reference to the accompanying drawings.

[0024] As shown in Figure 1, the present invention provides a three-dimensional head modeling method based on two images, which specifically comprises the following steps:

[0025] As shown in Figure 1, the input data of the present invention are a frontal face image, a side (profile) face image, and a generic three-dimensional head model. The front and side images can be captured by the user with a webcam or taken from ordinary digital photos. The generic three-dimensional head model consists of 229 three-dimensional vertices, and the subsequent deformation process starts from this model.
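The paragraph above only names the three inputs; a minimal sketch of gathering them is shown below. The file names, the use of NumPy and Pillow, and the Wavefront OBJ format for the generic 229-vertex model are assumptions for illustration and are not specified in the patent text.

```python
import numpy as np
from PIL import Image  # assumed image library; any loader would do

def load_inputs(front_path, side_path, mesh_path):
    """Collect the three inputs named in paragraph [0025]:
    a frontal photo, a profile photo, and a generic head mesh."""
    front = np.asarray(Image.open(front_path).convert("RGB"))
    side = np.asarray(Image.open(side_path).convert("RGB"))

    # Minimal Wavefront OBJ reader: keep vertices and triangular faces.
    vertices, faces = [], []
    with open(mesh_path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "v":
                vertices.append([float(x) for x in parts[1:4]])
            elif parts[0] == "f":
                faces.append([int(p.split("/")[0]) - 1 for p in parts[1:4]])
    vertices = np.array(vertices)          # expected shape (229, 3)
    faces = np.array(faces, dtype=int)
    return front, side, vertices, faces
```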

[0026] Step S1: Adjust the 3D position and attitude angle of the 3D head model so that it is approximately con...
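Step S1 is a rigid adjustment: the generic model is translated and rotated by attitude angles until its projection roughly lines up with the face in the photographs. A minimal sketch of such a pose adjustment follows; the Euler-angle convention, the uniform scale factor, and the orthographic projection are assumptions, since the step is cut off in this copy of the text.

```python
import numpy as np

def attitude_to_rotation(yaw, pitch, roll):
    """Rotation matrix from attitude angles in radians (Z-Y-X order assumed)."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

def pose_model(vertices, angles, translation, scale=1.0):
    """Rigidly pose the generic head model: rotate, scale, then translate."""
    R = attitude_to_rotation(*angles)
    return scale * (np.asarray(vertices, float) @ R.T) + np.asarray(translation, float)

def project_front(vertices):
    """Orthographic projection onto the image plane of the frontal photo."""
    return vertices[:, :2]

# Example: roughly face the camera with a slight head turn.
# posed = pose_model(model_vertices, angles=(0.1, 0.0, 0.0),
#                    translation=np.zeros(3))
```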



Abstract

The invention discloses a modeling method for recovering a user's three-dimensional face surface model from two images. The method comprises the following steps: interactively adjusting the positions of pre-defined control points to match the facial features and obtain an initial set of corresponding points; layering the control points according to semantics, and interpolating the deformation of all non-control vertices from the given initial point set with a moving least squares algorithm based on three-dimensional rigid transformations of vertices, so as to obtain the continuous, smooth surface of a realistic three-dimensional model; computing enveloping-cylinder texture coordinates for the deformed model, extracting color values from the input face images, synthesizing a texture image, and mapping it onto the model surface; and thereby generating a three-dimensional head model with realistic texture. The method overcomes the difficulty of traditional three-dimensional modeling techniques, which require multiple input images or excessive, complex manual interaction, and enables realistic three-dimensional head modeling from front and side images of ordinary resolution.
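The central interpolation step named in the abstract, moving least squares driven by per-vertex three-dimensional rigid transformations, is sketched below in generic form. The weight exponent alpha, the weighted Procrustes/Kabsch solve, and the omission of the semantic layering of control points are simplifications assumed for illustration; this is not the patent's exact formulation.

```python
import numpy as np

def mls_rigid_deform(vertices, control_src, control_dst, alpha=1.0, eps=1e-8):
    """Moving least squares deformation with a 3D rigid transform per vertex.

    vertices    : (N, 3) non-control vertices of the generic head mesh
    control_src : (K, 3) control points on the generic model
    control_dst : (K, 3) matched positions recovered from the two photos
    """
    vertices = np.asarray(vertices, dtype=float)
    control_src = np.asarray(control_src, dtype=float)
    control_dst = np.asarray(control_dst, dtype=float)
    deformed = np.empty_like(vertices)
    for n, v in enumerate(vertices):
        # Distance-based weights: nearby control points dominate the fit.
        w = 1.0 / (np.sum((control_src - v) ** 2, axis=1) + eps) ** alpha
        # Weighted centroids of the source and target control sets.
        p_star = (w[:, None] * control_src).sum(0) / w.sum()
        q_star = (w[:, None] * control_dst).sum(0) / w.sum()
        p_hat, q_hat = control_src - p_star, control_dst - q_star
        # Best-fit rotation from a weighted Procrustes/Kabsch solve.
        H = (w[:, None] * p_hat).T @ q_hat
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:      # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        # Rigidly carry the vertex along with its local best-fit transform.
        deformed[n] = R @ (v - p_star) + q_star
    return deformed
```

Because each vertex carries its own locally rigid transform, the interpolated surface bends smoothly between control points without the shearing an affine fit would introduce; how the patent's semantic layering of control points sits on top of such a solve is not reproduced here.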

Description

Technical field

[0001] The invention belongs to the technical fields of computer vision and computer graphics, and in particular relates to a technique for three-dimensional head shape and texture modeling based on two front and side images.

Background technique

[0002] A large number of two-dimensional face images can be acquired by shooting with a digital camera from different angles. If a three-dimensional face model can be recovered from these images, it will find wide use in fields such as virtual video conferencing, online role-playing games, and 3D movies. However, because the imaging process loses the three-dimensional structure of the scene, there are many technical difficulties in completely reconstructing the three-dimensional face model and its structural details from the input images. In computer graphics and computer vision, this is a problem that has been studied for many years. According to the technical means and the number of image samples required, the existing method...


Application Information

IPC(8): G06T17/00
Inventors: 王海波 (Wang Haibo), 潘春洪 (Pan Chunhong)
Owner: INST OF AUTOMATION CHINESE ACAD OF SCI