
Quickly-constructed human face action unit recognition method and system

A face action unit recognition technology, applied in character and pattern recognition, instruments, and computing. It addresses the problems of existing methods, namely high requirements on sample quality and quantity, long parameter tuning time, and low accuracy, and achieves fast computation, short parameter tuning time, and low requirements on sample quality and quantity.

Active Publication Date: 2020-04-10
JILIN UNIV
12 Cites, 4 Cited by

AI Technical Summary

Problems solved by technology

Therefore, judging facial expressions by identifying AU motion units in facial images is currently a common approach in the industry. However, existing methods for identifying AU motion units need to collect a large number of AU samples, place high demands on the quality and quantity of samples in the sample library, require long parameter tuning time, and achieve low accuracy.


Examples


Embodiment 1

[0065] The quickly-constructed face action unit recognition method of this embodiment comprises the following steps:

[0066] Input the face image to be recognized into the AU motion unit recognition model, which identifies the AU motion units of that face image. The construction process of the AU motion unit recognition model includes:

[0067] S1. Generate a sample neutral frame from a sample's neutral-expression face images captured within a preset time period (e.g., within 1 s) in the sample library. Specifically, determine the median across the neutral frames of that sample's neutral-expression face images; according to the median, generate the sample's neutral frame as the sample reference face image, and use it accordingly as the reference metric vector of the sample face image.
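Step S1 can be sketched as a per-coordinate median over a sample's neutral-expression landmark frames. This is a minimal illustration, assuming landmarks are stored as (x, y) coordinate arrays; the patent does not specify the landmark layout, so the shapes below are an assumption.

```python
import numpy as np

def build_reference_face(neutral_frames):
    """Build the sample reference face (sample neutral frame) by taking
    the per-coordinate median over neutral-expression frames.

    neutral_frames: sequence of landmark arrays, one per frame,
                    each of shape (n_landmarks, 2)  -- an assumed layout.
    Returns the median landmark array of shape (n_landmarks, 2).
    """
    frames = np.asarray(neutral_frames, dtype=float)  # (n_frames, n_landmarks, 2)
    # Median over the frame axis suppresses outlier frames within the window.
    return np.median(frames, axis=0)
```

The median (rather than the mean) is the natural choice here because it is robust to the occasional mis-detected landmark within the 1 s window.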

[0068] S2. Collect the key points of the samp...



Abstract

The invention relates to a quickly-constructed human face action unit recognition method and system. A relative distance between two corresponding key points is generated from the reference distance between any two key points on a sample reference face, the expression distance between the two corresponding key points on an expression face, and a relative relationship I between the reference distance and the expression distance. Likewise, relative angles among three corresponding key points are generated from the reference angles among any three key points on the reference face, the expression angles among the three corresponding key points on the expression face, and a relative relationship II between the reference angles and the expression angles. In this way, inter-individual differences in AU motion amplitude are cancelled and differences caused by people's appearance are balanced out. While preserving the natural appearance features of the identified person, the facial action units presented in the person's various expressions are recognized, ensuring that subsequent AU recognition is performed accurately. Constructing a feature library from the distance and angle features among face key points is proposed for the first time; the requirements on the quality and number of samples in the sample library are low, the calculation speed is high, and the parameter adjustment time is short.

Description

Technical field

[0001] The present invention relates to the technical field of computer vision processing, in particular to a quickly-constructed human face action unit recognition method and system.

Background technique

[0002] Facial emotion recognition is an important part of human-computer interaction and affective computing research. It involves cognitive science, anthropology, psychology, computer science and other research fields, and is of great significance to intelligent and harmonious human-computer interaction.

[0003] Action Units (AUs) are the basic units for describing facial muscle movements, and different AUs combine to form different facial expressions. Therefore, judging facial expressions by identifying AU motion units in facial images is currently a common approach in the industry. However, existing methods for identifying AU motion units need to collect a large number of AU samples, and the requirements ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62
CPC: G06V40/168, G06V40/176, G06F18/22, G06F18/214, Y02T10/40
Inventor: 李显生, 马佳磊, 任园园, 郑雪莲, 王杰
Owner: JILIN UNIV