
Two-dimensional face key feature point positioning method and system

A key feature point positioning technology applied in the field of two-dimensional face key feature point positioning methods and systems, addressing problems such as the low accuracy of face key feature point positioning and the prior art's inability to provide a suitable face key feature point positioning method.

Inactive Publication Date: 2015-09-09
SHENZHEN UNIV
Cites: 4 | Cited by: 30

AI Technical Summary

Problems solved by technology

[0006] The purpose of the embodiments of the present invention is to provide a method and system for locating key feature points of a two-dimensional human face, aiming to solve the problem that the prior art cannot provide a face key feature point positioning method and that the accuracy of face key feature point positioning is relatively low.



Examples


Embodiment 1

[0029] Figure 2 shows the implementation process of the two-dimensional face key feature point positioning method provided by Embodiment 1 of the present invention, and the details are as follows:

[0030] In step S201, an input two-dimensional face image is received, the two-dimensional position coordinates of the face key feature points are obtained using a preset key feature point acquisition algorithm, and the two-dimensional position coordinates are input into a pre-established feature point dimension conversion model.

[0031] In the embodiment of the present invention, the two-dimensional position coordinates of the face key feature points, that is, their initial two-dimensional position coordinates, are first obtained using the preset face key feature point acquisition algorithm, where the face key feature point acquisition algorithm can be an active shape model (Active Shape Model)...
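As an illustration of step S201, the sketch below obtains 2D landmark coordinates from an input face image. The patent names the Active Shape Model as one possible preset acquisition algorithm; here dlib's pretrained 68-point shape predictor is used purely as a stand-in detector, so the model file path, the 68-point layout, and the single-face assumption are illustrative choices, not part of the patent.

```python
# Minimal sketch of step S201: obtain 2D key feature point coordinates from an
# input face image. dlib's 68-point predictor is only a stand-in for the
# patent's "preset key feature point acquisition algorithm" (e.g. an Active
# Shape Model); the model file path below is an assumption.
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def get_2d_keypoints(image: np.ndarray) -> np.ndarray:
    """Return an (N, 2) array of 2D landmark coordinates for the first detected face.

    `image` is expected to be an 8-bit grayscale or RGB numpy array.
    """
    faces = detector(image, 1)              # upsample once to catch smaller faces
    if len(faces) == 0:
        raise ValueError("no face detected in the input image")
    shape = predictor(image, faces[0])      # 68 landmark points for the first face
    return np.array([[p.x, p.y] for p in shape.parts()], dtype=np.float64)
```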

Embodiment 2

[0043] Figure 3 shows the implementation flow of the two-dimensional face key feature point positioning method provided by Embodiment 2 of the present invention, and the details are as follows:

[0044] In step S301, an input two-dimensional face image is received, the two-dimensional position coordinates of the face key feature points are obtained using a preset key feature point acquisition algorithm, and the two-dimensional position coordinates are input into a pre-established feature point dimension conversion model.

[0045] In step S302, the three-dimensional position coordinates of the key feature points of the face corresponding to the two-dimensional position coordinates are calculated through the feature point dimension conversion model.
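The patent does not specify the internal form of the feature point dimension conversion model used in step S302. The sketch below is one plausible instantiation, assumed for illustration only: a linear mapping from flattened 2D landmark coordinates to flattened 3D landmark coordinates, fit by least squares on paired 2D/3D training data.

```python
# Illustrative sketch of a "feature point dimension conversion model": a linear
# regressor, fit on paired 2D/3D landmark sets, that predicts 3D landmark
# coordinates from 2D ones. The model form is an assumption, not the patent's.
import numpy as np

class DimensionConversionModel:
    def __init__(self):
        self.W = None  # (2N + 1, 3N) weight matrix learned from training pairs

    def fit(self, pts2d_train: np.ndarray, pts3d_train: np.ndarray) -> None:
        """pts2d_train: (M, N, 2) 2D landmarks; pts3d_train: (M, N, 3) 3D landmarks."""
        X = pts2d_train.reshape(len(pts2d_train), -1)      # (M, 2N)
        X = np.hstack([X, np.ones((len(X), 1))])           # append a bias column
        Y = pts3d_train.reshape(len(pts3d_train), -1)      # (M, 3N)
        self.W, *_ = np.linalg.lstsq(X, Y, rcond=None)     # least-squares fit

    def predict(self, pts2d: np.ndarray) -> np.ndarray:
        """pts2d: (N, 2) 2D landmarks -> (N, 3) predicted 3D landmarks."""
        x = np.append(pts2d.ravel(), 1.0)                  # flatten and add bias
        return (x @ self.W).reshape(-1, 3)
```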

[0046] In step S303, the three-dimensional position coordinates are projected onto the two-dimensional face image, and the estimated coordinates of the three-dimensional positions on the two-dimensional face image are ...
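The following sketch illustrates step S303 and the subsequent precision check. The patent states that the 3D coordinates are projected onto the image to obtain estimated coordinates and that a positioning precision is compared against a preset threshold; the specific projection model (scaled orthographic), the error formula (mean distance to the originally detected 2D coordinates), and the threshold value used here are illustrative assumptions.

```python
# Sketch of step S303 onward: project predicted 3D landmarks back onto the 2D
# image and score positioning precision. Projection model, error metric, and
# threshold value are assumptions made for illustration.
import numpy as np

def project_weak_perspective(pts3d: np.ndarray, scale: float = 1.0,
                             tx: float = 0.0, ty: float = 0.0) -> np.ndarray:
    """Scaled orthographic projection: drop depth, apply scale and translation."""
    return scale * pts3d[:, :2] + np.array([tx, ty])

def positioning_precision(estimated_2d: np.ndarray, detected_2d: np.ndarray) -> float:
    """Mean Euclidean distance between projected and originally detected landmarks."""
    return float(np.mean(np.linalg.norm(estimated_2d - detected_2d, axis=1)))

PRECISION_THRESHOLD = 3.0   # pixels; arbitrary value chosen only for illustration

def accept_keypoints(pts2d: np.ndarray, pts3d: np.ndarray) -> bool:
    """Accept the detected 2D points as final key feature points when the
    precision value falls below the preset threshold."""
    estimated = project_weak_perspective(pts3d)
    return positioning_precision(estimated, pts2d) < PRECISION_THRESHOLD
```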

Embodiment 3

[0057] Figure 4 shows the structure of the two-dimensional face key feature point positioning system provided by Embodiment 3 of the present invention. For the convenience of description, only the parts related to the embodiment of the present invention are shown, including:

[0058] The key point two-dimensional coordinate acquisition unit 41 is used to receive the input two-dimensional face image, obtain the two-dimensional position coordinates of the face key feature points using the preset face key feature point acquisition algorithm, and input the two-dimensional position coordinates into the pre-established feature point dimension conversion model;

[0059] The key point three-dimensional coordinate calculation unit 42 is configured to calculate the three-dimensional position coordinates of the face key feature points corresponding to the two-dimensional position coordinates through the feature point dimension conversion model;

[0...
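To make the system structure of Embodiment 3 concrete, the sketch below maps unit 41 (key point 2D coordinate acquisition) and unit 42 (key point 3D coordinate calculation) onto one class; the remaining units are truncated in the text, so the projection and precision steps are kept generic. The constructor callables and the threshold are placeholders for the components described in Embodiments 1 and 2, not the patent's actual implementation.

```python
# Minimal sketch of the Embodiment 3 system: each described unit becomes a step
# inside one class. The injected callables (landmark detector, dimension
# conversion model, projection function) are placeholder assumptions.
import numpy as np

class FaceKeypointPositioningSystem:
    def __init__(self, detect_2d, conversion_model, project_3d_to_2d, threshold: float):
        self.detect_2d = detect_2d                # unit 41: key point 2D coordinate acquisition
        self.conversion_model = conversion_model  # unit 42: key point 3D coordinate calculation
        self.project_3d_to_2d = project_3d_to_2d  # projection onto the 2D face image
        self.threshold = threshold                # preset positioning precision threshold

    def locate(self, image: np.ndarray):
        pts2d = self.detect_2d(image)                    # unit 41
        pts3d = self.conversion_model.predict(pts2d)     # unit 42
        estimated = self.project_3d_to_2d(pts3d)         # estimated coordinates on the image
        precision = float(np.mean(np.linalg.norm(estimated - pts2d, axis=1)))
        # Accept the detected pixels as key feature points when precision beats the threshold.
        return pts2d if precision < self.threshold else None
```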


Abstract

The invention is suitable for the technical field of image processing, and provides a two-dimensional face key feature point positioning method and system. The method comprises the following steps: receiving an input two-dimensional face image, obtaining the two-dimensional position coordinates of the face key feature points using a preset face key feature point acquisition algorithm, and inputting the two-dimensional position coordinates into a pre-established feature point dimension conversion model; calculating, through the feature point dimension conversion model, the three-dimensional position coordinates of the face key feature points corresponding to the two-dimensional position coordinates; projecting the three-dimensional position coordinates onto the two-dimensional face image to obtain the estimated coordinates of the three-dimensional positions on the two-dimensional face image; calculating the positioning precision of the face key feature points through the estimated coordinates and the three-dimensional position coordinates; and, when the positioning precision is smaller than a preset threshold value, setting the pixels corresponding to the two-dimensional position coordinates in the two-dimensional face image as the face key feature points of the two-dimensional face image, thereby improving the positioning precision of the face key feature points.
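To show the overall flow described in the abstract end to end, the following self-contained sketch wires the steps together. The landmark detector and the dimension conversion model are trivial stubs (fixed points and a zero-depth lifting), and the threshold is arbitrary; they illustrate only the control flow, not the patent's actual components.

```python
# End-to-end sketch of the pipeline in the abstract, using stub components.
import numpy as np

def detect_keypoints_2d(image: np.ndarray) -> np.ndarray:
    """Stub for the preset key feature point acquisition algorithm (e.g. ASM)."""
    h, w = image.shape[:2]
    # Placeholder: five fixed points roughly where eyes, nose, and mouth corners sit.
    return np.array([[0.3 * w, 0.4 * h], [0.7 * w, 0.4 * h],
                     [0.5 * w, 0.6 * h], [0.35 * w, 0.75 * h], [0.65 * w, 0.75 * h]])

def convert_2d_to_3d(pts2d: np.ndarray) -> np.ndarray:
    """Stub for the feature point dimension conversion model: lift with zero depth."""
    return np.hstack([pts2d, np.zeros((len(pts2d), 1))])

def project_to_image(pts3d: np.ndarray) -> np.ndarray:
    """Orthographic projection back onto the image plane."""
    return pts3d[:, :2]

def locate_keypoints(image: np.ndarray, threshold: float = 3.0):
    pts2d = detect_keypoints_2d(image)           # 2D position coordinates
    pts3d = convert_2d_to_3d(pts2d)              # 3D position coordinates
    estimated = project_to_image(pts3d)          # estimated coordinates on the image
    precision = float(np.mean(np.linalg.norm(estimated - pts2d, axis=1)))
    # Accept the detected pixels as key feature points when precision beats the threshold.
    return pts2d if precision < threshold else None

print(locate_keypoints(np.zeros((200, 200))))
```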

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a method and system for locating key feature points of a two-dimensional human face.

Background Technique

[0002] At present, most human-computer interaction applications need to locate the face first and then perform further analysis such as face recognition, expression analysis, and age estimation. When locating a face, the key feature points of the face in the image must be located, that is, the positions of the eye corners, eye centers, eyebrows, nose, mouth corners, and so on must be determined.

[0003] In terms of image type, methods for locating face key feature points can be based on two-dimensional face images (brightness images) or on three-dimensional face images (depth images). The pixels of a two-dimensional face image represent brightness. For exam...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00
CPC: G06V40/168
Inventors: 于仕琪, 李立, 汪青
Owner: SHENZHEN UNIV