
Deep network model constructing method, and facial expression identification method and system

A technology relating to facial expression recognition and deep networks, applied to the construction of deep network models and to the field of facial expression recognition, which addresses the problem that the performance of existing approaches cannot meet practical application requirements.

Status: Inactive; Publication Date: 2018-02-02
INST OF ACOUSTICS CHINESE ACAD OF SCI
Cites: 4; Cited by: 20

AI Technical Summary

Problems solved by technology

[0006] Based on the above, whether the traditional "manual" approach of determining feature descriptors is applied in the field of face analysis, or convolutional neural networks (and other existing deep learning methods) are applied directly, the resulting performance and final effect cannot meet the needs of practical applications.


Examples


Embodiment Construction

[0065] The present invention will be further described below in conjunction with the drawings and specific embodiments.

[0066] A method for constructing a deep network model, the method comprising:

[0067] Step S1) Establish a deep network model for facial expression recognition, and initialize the parameters of the deep network model;

[0068] As shown in Figure 1, the deep network model includes: a convolutional neural network for extracting the high-level features of a picture, a reconstruction network for extracting the low-level features of a picture, and a joint decision network for judging the facial expression (see the illustrative sketch after the listed steps);

[0069] The step S1) specifically includes:

[0070] Step S1-1) Build a convolutional neural network from the combination of three convolutional layers C1, C2 and C3 and three downsampling layers S1, S2 and S3, using full connections between the layers; initialize the parameter set {CS} of the convolutional neural network;

[0071] Step S1-2) Establish a recons...
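The following is a minimal, non-authoritative sketch of the three-branch structure described above, written in PyTorch. The kernel sizes, channel counts, input resolution, the internals of the reconstruction branch and the joint decision network, and the number of expression classes are not specified in the text shown here and are illustrative assumptions only.

```python
# Illustrative sketch only: layer sizes, kernel sizes, input resolution and the
# internals of the reconstruction and joint-decision branches are NOT given in
# the visible text and are assumed here for demonstration.
import torch
import torch.nn as nn

class ExpressionNet(nn.Module):
    def __init__(self, num_classes: int = 7):  # 7 basic expression classes assumed
        super().__init__()
        # Convolutional branch: three convolutional layers C1-C3, each followed by
        # a downsampling layer S1-S3, extracting high-level features of the picture.
        self.conv_branch = nn.Sequential(
            nn.Conv2d(1, 16, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),   # C1, S1
            nn.Conv2d(16, 32, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),  # C2, S2
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # C3, S3
        )
        # Reconstruction branch: extracts low-level features of the picture
        # (assumed here as a shallow encoder over the raw 48x48 grayscale input).
        self.recon_branch = nn.Sequential(
            nn.Flatten(),
            nn.Linear(48 * 48, 256), nn.ReLU(),
        )
        # Joint decision network: fuses both feature sets and judges the expression.
        self.decision = nn.Sequential(
            nn.Linear(64 * 6 * 6 + 256, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        high = self.conv_branch(x).flatten(1)   # high-level features
        low = self.recon_branch(x)              # low-level features
        return self.decision(torch.cat([high, low], dim=1))
```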


Abstract

The invention discloses a deep network model constructing method, which comprises the steps of: step S1) establishing a deep network model used for facial expression identification and initializing the parameters of the deep network model, wherein the deep network model comprises a convolutional neural network used for extracting high-level features of pictures, a reconstruction network used for extracting low-level features of the pictures, and a joint decision network used for identifying facial expressions; step S2) dividing all training pictures into N groups; step S3) sequentially inputting each group of pictures into the deep network model and training the parameters of the deep network model based on a gradient descent method; step S4) regarding the parameters of the deep network model obtained in step S3) as the initial values of the model parameters, re-dividing all the training pictures into N groups, then jumping to step S3), and repeating the process until the trained model parameters no longer change compared with their initial values. The invention further discloses a facial expression identification method and a facial expression identification system.
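As a rough illustration of the training procedure in steps S2) through S4), the sketch below assumes a PyTorch model (for example the ExpressionNet sketch above), plain stochastic gradient descent as the "gradient descent method", random re-division of the training pictures into N groups on every pass, and a small parameter-change tolerance as the practical test for "parameters no longer change". The group count, learning rate and tolerance are illustrative values, not taken from the patent.

```python
# Minimal sketch of the training loop in steps S2)-S4); values are illustrative.
import torch
import torch.nn as nn

def train_until_stable(model: nn.Module, images: torch.Tensor, labels: torch.Tensor,
                       n_groups: int = 10, lr: float = 0.01, tol: float = 1e-4) -> None:
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    while True:
        # Snapshot the parameters at the start of a pass (the "initial values").
        before = [p.detach().clone() for p in model.parameters()]
        # Steps S2)/S4): randomly re-divide all training pictures into N groups.
        perm = torch.randperm(images.size(0))
        for group in perm.chunk(n_groups):
            # Step S3): feed each group in turn and update by gradient descent.
            optimizer.zero_grad()
            loss = criterion(model(images[group]), labels[group])
            loss.backward()
            optimizer.step()
        # Step S4): stop once the trained parameters no longer change
        # (here: maximum absolute parameter change below a small tolerance).
        change = max((p - q).abs().max().item()
                     for p, q in zip(model.parameters(), before))
        if change < tol:
            break
```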

Description

Technical field

[0001] The invention relates to the technical fields of computer vision and deep learning, and in particular to a method for constructing a deep network model and a method and system for facial expression recognition.

Background technique

[0002] As an object that is very difficult for computers to recognize and analyze, the human face has attracted wide attention from researchers since the 1990s. Successful and effective face analysis has huge application prospects in fields such as intelligent surveillance, video indexing and demographic statistics.

[0003] Existing research in the field of face analysis is largely based on "manually" determined feature descriptors to represent human faces, combined with classifier or regression algorithms. Manual selection of features often consumes a lot of preparation time and is subjective, and the selected features often perform well on a certain type of data but, when extended to o...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06V40/175, G06V40/172, G06N3/045, G06F18/2414
Inventors: 刘鹏 (Liu Peng), 李松斌 (Li Songbin)
Owner: INST OF ACOUSTICS CHINESE ACAD OF SCI