
An aspect-level emotion classification model and method based on dual-memory attention

An emotion classification and attention technology, applied in biological neural network models, semantic analysis, special data processing applications, etc., which can solve the problem of existing models ignoring emotional features.

Active Publication Date: 2019-03-15
UNIV OF ELECTRONICS SCI & TECH OF CHINA
4 Cites, 69 Cited by

AI Technical Summary

Problems solved by technology

[0004] Among the existing solutions, the recurrent neural network model based on the attention mechanism and the multi-layer model based on the attention mechanism perform better. The former performs better because the feature-abstraction capability of the deep learning model helps it obtain a more accurate attention distribution, while the latter uses the attention captured by the previous layer to help the next layer compute a more accurate attention distribution. Even so, these solutions still tend to ignore very important word-level or phrase-level emotional features.
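To make the layered attention idea above concrete, here is a minimal PyTorch sketch of a multi-hop feed-forward attention module in which each hop reuses the result of the previous hop to refine the query; the class name, hop count and scoring form are illustrative assumptions rather than the patent's construction.

```python
import torch
import torch.nn as nn

class MultiHopAttention(nn.Module):
    """Multi-hop feed-forward attention sketch (illustrative, not the patent's
    exact design): each hop re-scores the memory with the query produced by
    the previous hop, mirroring 'previous layer helps the next layer'."""
    def __init__(self, hidden_dim, n_hops=3):
        super().__init__()
        self.n_hops = n_hops
        # Feed-forward scorer: score = v^T tanh(W [memory_slot; query])
        self.proj = nn.Linear(2 * hidden_dim, hidden_dim)
        self.score = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, memory, query):
        # memory: (batch, seq_len, hidden_dim); query: (batch, hidden_dim)
        for _ in range(self.n_hops):
            q = query.unsqueeze(1).expand_as(memory)
            scores = self.score(torch.tanh(self.proj(torch.cat([memory, q], dim=-1))))
            attn = torch.softmax(scores, dim=1)        # attention distribution over slots
            context = (attn * memory).sum(dim=1)       # weighted sum of memory slots
            query = context + query                    # refined query for the next hop
        return query
```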



Examples


Embodiment Construction

[0070] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0071] This embodiment provides an RNN encoder-decoder sentiment classification model with dual-memory attention, which consists of an encoder, two memory modules, a decoder and a classifier. First, the encoder encodes the word vectors corresponding to the input sentence to obtain the hidden-layer states of the GRU recurrent neural network and an intermediate vector, from which two memory modules, om and em, are constructed; these store potential word-level and phrase-level features respectively. Secondly, the decoder performs a first decoding stage on em and then a second decoding stage on om, with the aim of capturing phrase-level and word-level features from the two memories respectively. In particular, the present invention adopts a special feed-forward neural network attention layer that continuously captures the important emotional features ...
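As a rough illustration of the architecture in [0071], the PyTorch sketch below wires together a GRU encoder, two memories (em from the encoder hidden states, om derived from the word vectors), a GRU decoder that attends to em and then to om through a feed-forward attention layer, and a Softmax classifier. The class and parameter names, the projection used to build om, and the way each stage updates the decoder state are assumptions made for illustration, not the patent's exact construction.

```python
import torch
import torch.nn as nn

class DualMemoryClassifier(nn.Module):
    """Sketch of a dual-memory encoder-decoder classifier (illustrative)."""
    def __init__(self, embed_dim, hidden_dim, n_classes=3):
        super().__init__()
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.decoder_cell = nn.GRUCell(hidden_dim, hidden_dim)
        # Feed-forward attention layer shared by both decoding stages
        self.attn_proj = nn.Linear(2 * hidden_dim, hidden_dim)
        self.attn_score = nn.Linear(hidden_dim, 1, bias=False)
        self.embed_to_hidden = nn.Linear(embed_dim, hidden_dim)  # assumption: project om into hidden space
        self.classifier = nn.Linear(hidden_dim, n_classes)

    def attend(self, memory, query):
        # memory: (batch, seq_len, hidden_dim); query: (batch, hidden_dim)
        q = query.unsqueeze(1).expand_as(memory)
        scores = self.attn_score(torch.tanh(self.attn_proj(torch.cat([memory, q], dim=-1))))
        attn = torch.softmax(scores, dim=1)
        return (attn * memory).sum(dim=1)              # attended context vector

    def forward(self, word_vectors):
        # word_vectors: (batch, seq_len, embed_dim)
        hidden_states, final_state = self.encoder(word_vectors)
        em = hidden_states                             # phrase-level memory (encoder hidden states)
        om = self.embed_to_hidden(word_vectors)        # word-level memory (from the original word vectors)
        state = final_state.squeeze(0)                 # intermediate vector as the initial decoder state
        # First decoding stage: capture phrase-level features from em
        state = self.decoder_cell(self.attend(em, state), state)
        # Second decoding stage: capture word-level features from om
        state = self.decoder_cell(self.attend(om, state), state)
        return torch.softmax(self.classifier(state), dim=-1)
```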



Abstract

The invention discloses an aspect-level emotion classification model and method based on dual-memory attention, belonging to the technical field of text emotion classification. The model mainly comprises three modules: an encoder composed of a standard GRU recurrent neural network, a GRU recurrent neural network decoder that introduces a feed-forward neural network attention layer, and a Softmax classifier. The model treats the input sentence as a sequence and, based on the attention paid to the position of the aspect-level words in the sentence, constructs two memory modules from the original text sequence and the hidden-layer states of the encoder respectively. A randomly initialized attention distribution is fine-tuned through the feed-forward neural network attention layer to capture the important emotional features in the sentence, and an encoder-decoder classification model is established based on the GRU recurrent neural network's ability to learn from sequences, thereby achieving aspect-level emotion classification. The invention can significantly improve the robustness of text emotion classification and improve classification accuracy.
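The abstract states that the two memories are built from the original text sequence and from the encoder hidden states according to the position of the aspect-level words. The patent text shown here does not give the exact weighting formula, so the sketch below assumes a simple linear location-decay weighting as one plausible realization; the function name and the decay form are illustrative.

```python
import numpy as np

def position_weighted_memory(features, aspect_index):
    """Weight each token's feature vector by its distance to the aspect word.
    The linear decay (1 - distance / seq_len) is an assumed scheme, not the
    patent's exact formula.  features: (seq_len, dim) array."""
    seq_len = features.shape[0]
    weights = 1.0 - np.abs(np.arange(seq_len) - aspect_index) / seq_len
    return features * weights[:, None]

# Usage: build both memories with the same location weights (illustrative)
# om = position_weighted_memory(word_vectors, aspect_index)    # word-level memory
# em = position_weighted_memory(hidden_states, aspect_index)   # phrase-level memory
```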

Description

Technical field

[0001] The invention belongs to the technical field of text emotion classification, in particular to the technical field of natural language processing, and specifically relates to an aspect-level emotion classification model and method based on a dual-memory attention mechanism and an encoder-decoder structure.

Background technique

[0002] Sentiment analysis, also known as opinion mining, is a research field that analyzes people's subjective feelings, such as opinions, emotions, evaluations and attitudes, towards entities such as products, services, organizations, individuals, events, topics and their attributes. Aspect-level sentiment analysis analyzes the emotional tendency (positive, negative or neutral) expressed towards a specific aspect; it is a subdivision task of sentiment analysis and one of the fundamental concerns in this field.

[0003] Traditional feature representation methods include One-hot, N-Gram, and some effective features designed by domain expert...
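For context on the traditional baselines named in paragraph [0003], the snippet below shows One-hot style (binary) bag-of-N-gram feature extraction with scikit-learn; the sample sentences are invented for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer

# Unigram + bigram features; binary=True gives a one-hot style indicator per n-gram
sentences = [
    "the food was great but the service was slow",   # made-up example sentences
    "great screen, terrible battery life",
]
vectorizer = CountVectorizer(ngram_range=(1, 2), binary=True)
features = vectorizer.fit_transform(sentences)
print(features.shape)                        # (2, number of distinct uni-/bi-grams)
print(vectorizer.get_feature_names_out()[:5])
```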

Claims


Application Information

IPC(8): G06F17/27, G06N3/04
CPC: G06F40/30, G06N3/045
Inventor: 刘峤, 吴培辛, 曾义夫, 曾唯智, 蓝天
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA