Various-information coupling emotion recognition method for human-computer interaction

A method for emotion recognition from comprehensive information, applied to speech recognition, character and pattern recognition, computer components, and related fields. It addresses the problems that existing methods are strongly affected by noise, perform poorly on strongly emotional segments, and cannot fully capture the emotional transmission of human conversation.

Active Publication Date: 2014-12-10
山东心法科技有限公司

AI Technical Summary

Problems solved by technology

[0003] Current recognition methods fall into three groups. Text-only methods mostly use models such as TF-IDF to recognize the emotion of a text; most require preprocessing, and their accuracy in multilingual, multi-category recognition is low. Speech-only methods mostly use either prosodic features or overall spectrum-based features: among prosodic features, the feature values carrying strong emotion are difficult to extract and are heavily affected by noise, while spectrum-based features perform poorly on the strongly emotional parts. In combined multi-information recognition, most work pairs only two modalities (text with voice, or voice with video), without considering that human interaction is a process in which all three kinds of information are transmitted and interact together; analyzing only two of the three features therefore cannot fully describe the emotional transmission in human conversation, which leads to inaccurate sentiment analysis. Finally, in feature-model training, general machine learning methods face difficulty in training and prediction on large-dimensional, large-scale data.

Method used




Embodiment Construction

[0070] In this embodiment, a deep-learning-based multi-information coupling emotion recognition method for human-computer interaction includes the following steps:

[0071] Step 1. Use a camera device and a microphone to synchronously acquire video data of the speaker's facial expressions and the corresponding voice data; the video must capture the speaker's face. The collected videos are classified by emotion into six categories: angry, fear, happy, neutral, sad, and surprise, represented by the labels 1, 2, 3, 4, 5, and 6 respectively. The comprehensive emotional features of each video can be represented by a four-tuple Y.
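The six-way emotion labeling in step 1 can be sketched as a simple lookup; the dictionary and function names here are illustrative, not from the patent text.

```python
# Hypothetical mapping of the six emotion classes to their integer labels
# (1-6), as described in step 1 of the embodiment.
EMOTION_LABELS = {
    "angry": 1,
    "fear": 2,
    "happy": 3,
    "neutral": 4,
    "sad": 5,
    "surprise": 6,
}

def label_of(emotion: str) -> int:
    """Return the integer label for an emotion class name."""
    return EMOTION_LABELS[emotion.lower()]

print(label_of("Happy"))  # 3
```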

[0072] Y = (E, V_T, V_S, V_i)  (1)

[0073] In formula (1), E represents the emotion classification of the video, V_T represents the first information feature, that is, the text information feature (TextFeature), V_S indicates that the second infor...
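The four-tuple Y = (E, V_T, V_S, V_i) of formula (1) could be held in a small container like the following; the field types and example values are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import Sequence

# Illustrative sketch of the four-tuple Y = (E, V_T, V_S, V_i) from
# formula (1); field types and example values are assumed, not specified
# by the patent text.
@dataclass
class ComprehensiveFeature:
    E: int                # emotion class label (1-6)
    V_T: Sequence[float]  # text information features
    V_S: Sequence[float]  # audio (speech) information features
    V_i: Sequence[float]  # image (expression) information features

# Example: a "happy" (label 3) sample with toy feature vectors.
y = ComprehensiveFeature(E=3, V_T=[0.1, 0.2], V_S=[0.3], V_i=[0.4, 0.5])
print(y.E)  # 3
```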



Abstract

The invention discloses a various-information coupling emotion recognition method for human-computer interaction. The method includes the steps of: 1, acquiring video and audio data of facial expressions; 2, extracting features of the text content to obtain the text information features; 3, extracting and coupling the prosodic features and overall audio features of the audio data; 4, coupling the text information features, audio information features, and expression information features to obtain the comprehensive information features; 5, performing data optimization on the comprehensive information features by a deep learning method and training a classifier on the optimized comprehensive information features, obtaining an emotion recognition model for various-information coupling emotion recognition. With this method, the data information of text, audio, and video can be fully combined, improving the accuracy of emotion state judgment in human-computer interaction.
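Step 4 of the abstract couples the three modality features into one comprehensive feature. A minimal sketch is plain vector concatenation; the coupling operator itself is an assumption, since the abstract does not specify how the features are combined.

```python
import numpy as np

# Couple text, audio, and image feature vectors into one comprehensive
# feature vector by concatenation (an assumed, illustrative coupling).
def couple_features(v_text, v_audio, v_image):
    return np.concatenate([
        np.asarray(v_text, dtype=float),
        np.asarray(v_audio, dtype=float),
        np.asarray(v_image, dtype=float),
    ])

combined = couple_features([0.1, 0.2], [0.3], [0.4, 0.5])
print(combined.shape)  # (5,)
```

The resulting vector would then be the input to the deep-learning optimization and classifier training of step 5.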

Description

Technical field

[0001] The invention belongs to the fields of natural language processing and affective computing; in particular, it is a deep-learning-based multi-information emotion analysis method for human-computer interaction.

Background technique

[0002] Affective computing refers to the ability of machines to recognize and understand human emotions. The information forms humans use to express emotions, such as text, speech, and images, contain feature values that can represent those emotions. By extracting these feature values and applying machine learning methods, the machine can learn the emotional information they contain by itself, that is, come to understand human emotions.

[0003] The current recognition methods mainly include: text-only methods, most of which use models such as TF-IDF for text emotion recognition, most of which require preprocessing of the text, and at the same time, the accuracy rate in multilingual a...
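Paragraph [0003] names TF-IDF as the usual text-only baseline. A toy TF-IDF computation might look like the following; the corpus and tokenization are invented purely for the example.

```python
import math
from collections import Counter

# Toy corpus of pre-tokenized documents (invented for illustration).
docs = [["i", "am", "happy"], ["i", "am", "sad"], ["so", "happy", "today"]]

def tfidf(term, doc, corpus):
    """TF-IDF of `term` in `doc`: term frequency times inverse
    document frequency over the corpus."""
    tf = Counter(doc)[term] / len(doc)
    df = sum(1 for d in corpus if term in d)
    if df == 0:
        return 0.0
    return tf * math.log(len(corpus) / df)

# "happy" appears in 2 of 3 documents, once in docs[0] of length 3:
# tf = 1/3, idf = ln(3/2).
print(round(tfidf("happy", docs[0], docs), 4))  # 0.1352
```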

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G10L15/02, G10L15/26, G10L25/63, G06K9/00, G06K9/46
Inventor: 孙晓, 陈炜亮, 李承程, 任福继
Owner: 山东心法科技有限公司