
Facial expression analysis method based on capsule network

A facial expression analysis method and technology, applied to neural learning methods, biological neural network models, the acquisition/recognition of facial features, and related areas. It addresses problems such as the heavy workload of manual labeling, the difficulty of emotion analysis on face images, and noise interference in facial expressions, and achieves improved classification, high expression-recognition accuracy, and robustness to noise.

Pending Publication Date: 2021-06-22
Applicant: 苏州元启创人工智能科技有限公司


Problems solved by technology

However, current capsule-network technology is susceptible to noise interference from features irrelevant to facial expressions, which increases the workload of manual labeling, makes it difficult to perform emotion analysis on face images with complex backgrounds, and keeps expression-recognition accuracy low.




Embodiment Construction

[0046] The present invention will be further described in detail below in combination with test examples and specific embodiments. This should not, however, be understood to mean that the scope of the above subject matter of the present invention is limited to the following embodiments; all techniques realized on the basis of the content of the present invention fall within the scope of the present invention.

[0047] The implementation steps of the capsule-network-based face image emotion analysis method provided by the present invention are shown in the neural network diagram of Figure 1.

[0048] Figure 1 shows the flow chart of the network model of the capsule-network-based face image emotion analysis network of the present invention. In one example of the present invention, the model is first trained on the augmented data set, and every input has size 32×32×3. The input is passed through three convolutional layers to extract features, and the ...
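As a non-authoritative sketch of the front end described in [0048], the following PyTorch code passes a 32×32×3 input through three convolutional layers before any capsule layers; the channel widths, kernel sizes, and strides are assumptions chosen for illustration and are not taken from the patent text.

# Hypothetical convolutional front end: 32x32x3 input, three conv layers.
import torch
import torch.nn as nn

class ConvFrontEnd(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=3, padding=1),    # 32x32x3  -> 32x32x64
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, kernel_size=3, padding=1),  # 32x32x64 -> 32x32x128
            nn.ReLU(inplace=True),
            nn.Conv2d(128, 256, kernel_size=3, stride=2, padding=1),  # -> 16x16x256
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.features(x)

if __name__ == "__main__":
    x = torch.randn(8, 3, 32, 32)   # a batch of 32x32x3 face crops
    feats = ConvFrontEnd()(x)
    print(feats.shape)              # torch.Size([8, 256, 16, 16])

The feature maps produced here would feed a primary capsule layer in a capsule-network design; that part of the pipeline is not reproduced because its parameters are not given in the visible text.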



Abstract

The invention provides a facial expression analysis method based on a capsule network. The method improves the original capsule network with a multi-channel parallel technique, a Strict-Squash function, a Lane-Filter function, and DropCircuit technology, thereby avoiding noise interference caused by features irrelevant to facial expressions, reducing the workload of manual labeling, and improving the accuracy of facial expression analysis. Emotion analysis can therefore be carried out on face images with complex backgrounds, and expression-recognition precision is improved. Building on the capsule network, the method can effectively learn the spatial relationships between objects, recognizes slightly deformed objects and pictures of the same object from different viewing angles well, and is not easily disturbed by noise caused by features irrelevant to facial expressions. The method can correctly perform emotion analysis on face images with complex backgrounds, achieves high expression-recognition precision, has a wide range of applications, and is highly robust.
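The abstract names a Strict-Squash function but does not give its formula in the visible text. For orientation only, the sketch below implements the standard capsule-network squash non-linearity (Sabour et al., 2017), squash(s) = (‖s‖² / (1 + ‖s‖²)) · s/‖s‖, which such variants typically modify; it is not the patented Strict-Squash function itself.

# Standard capsule squash non-linearity, shown as a reference point only.
import torch

def squash(s: torch.Tensor, dim: int = -1, eps: float = 1e-8) -> torch.Tensor:
    """Shrink each capsule vector to length < 1 while preserving its orientation."""
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * s / torch.sqrt(sq_norm + eps)

# Example: 8 primary capsules of dimension 16 for one input.
caps = torch.randn(1, 8, 16)
out = squash(caps)
print(out.norm(dim=-1))  # every capsule length now lies in (0, 1)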

Description

Technical field

[0001] The invention relates to the fields of facial expression recognition, emotion analysis, and deep learning.

Background technique

[0002] When collecting face images, it is inevitable to introduce noise such as lighting and background, and the face samples themselves differ individually in age, gender, skin color, build, whether glasses are worn, and so on. These noises and differences weaken the expression of expression-related features and reduce the accuracy of expression recognition; directly inputting the original pictures to extract expression features therefore has a strongly negative effect on the recognition results. Secondly, existing facial expression databases contain little data, which makes it difficult to meet the training needs of a deep network, and the resulting deep model is prone to overfitting. [0003] Facial expression recognition refers to the analysis of a given face image or imag...
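Since the background points to small expression databases and overfitting, and the embodiment in [0048] trains on an augmented data set, the following is a minimal augmentation sketch of the kind commonly used for 32×32 face crops; the particular transforms are assumptions for illustration, not the recipe claimed in the patent.

# Hypothetical augmentation pipeline for 32x32 face crops (torchvision).
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),   # mirror faces left/right
    transforms.RandomRotation(degrees=10),    # small pose perturbation
    transforms.ColorJitter(brightness=0.2,    # simulate lighting noise
                           contrast=0.2),
    transforms.RandomCrop(32, padding=2),     # small translation jitter
    transforms.ToTensor(),
])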

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/00G06K9/62G06N3/04G06N3/08
CPCG06N3/082G06V40/168G06V40/174G06N3/045G06F18/2135
Inventor 何慧华
Owner 苏州元启创人工智能科技有限公司