Multi-modal sentiment classification method based on attention guidance two-way capsule network

A multimodal sentiment classification technology, applied in database clustering/classification, character and pattern recognition, instruments, etc., that promotes and enhances cross-modal homogeneity and improves learning efficiency

Active Publication Date: 2022-03-01
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

However, the above techniques focus only on the spatial relationship between the representation layer and the output layer through bottom-up attention.
This ignores the intrinsic contextual relationships among the multiple modalities, fails to provide global guidance for each modality, and leads to a locally suboptimal decoupling process.

Method used



Examples


Embodiment Construction

[0060] The method of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0061] As shown in Figures 1 and 2, the multi-modal emotion classification method based on an attention-guided bidirectional capsule network comprises the following specific steps:

[0062] As shown in Figure 1, the attention-guided bidirectional capsule network adopted by this method consists of two important components: 1) a multimodal dynamic interaction enhancement module, which enhances cross-modal homogeneity at the feature level; and 2) the ABCN, which explores global multimodal common cues (a minimal sketch of both components is given after Step 1 below). The method includes the following steps:

[0063] Step 1. Acquire multimodal data

[0064] Multimodal data refers to multiple types of modal data, such as the audio modality, the video modality, and the text modality; the purpose of multimodal fusion is to obtain the complementarity and consistency between the multiple modalities under the same task. The two public sent...
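
The remaining steps are truncated in this excerpt, so only the overall structure from [0062] can be illustrated. Below is a minimal PyTorch sketch of the two components, a dynamic interaction enhancement stage followed by the ABCN; the use of cross-modal attention for the enhancement, the class names, all dimensions, and the reduction of the ABCN to a plain classifier are assumptions made purely for illustration, not taken from the patent.

```python
# Hedged sketch of the two components named in [0062]; everything below is
# an illustrative assumption rather than the patented implementation.
import torch
import torch.nn as nn


class DynamicInteractionEnhancement(nn.Module):
    """Assumed form of the multimodal dynamic interaction enhancement module:
    enhance one modality's features with context attended from another."""

    def __init__(self, dim: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)

    def forward(self, query_mod, other_mod):
        # Cross-modal attention followed by a residual connection, so the
        # enhanced features stay in the query modality's feature space.
        enhanced, _ = self.attn(query_mod, other_mod, other_mod)
        return query_mod + enhanced


class ABCNStub(nn.Module):
    """Stand-in for the attention-guided bidirectional capsule network;
    reduced here to mean-pooling plus a linear classifier for runnability."""

    def __init__(self, dim: int, num_classes: int):
        super().__init__()
        self.classifier = nn.Linear(dim, num_classes)

    def forward(self, fused):
        return self.classifier(fused.mean(dim=1))


# Toy usage: 2 samples, 8 time steps, 64-dim features per modality.
text, audio, video = (torch.randn(2, 8, 64) for _ in range(3))
enhance = DynamicInteractionEnhancement(dim=64)
text_enh = enhance(text, audio)    # text features enhanced by audio context
video_enh = enhance(video, text)   # video features enhanced by text context
fused = torch.cat([text_enh, audio, video_enh], dim=1)
logits = ABCNStub(dim=64, num_classes=3)(fused)
print(logits.shape)                # torch.Size([2, 3])
```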



Abstract

The invention discloses a multi-modal sentiment classification method based on an attention-guided two-way capsule network. Because the capsule network possesses trainable viewpoint-invariant transformations, its effectiveness in analyzing the heterogeneity of multi-modal learning has already been demonstrated. In the preprocessing stage, a multi-modal dynamic interaction enhancement module is provided that explicitly enhances cross-modal homogeneity at the feature level, so that the model can effectively execute the multi-modal decoupling process in a more compact local common space. On this basis, an attention-guided bidirectional capsule network (ABCN) is provided that explores global multi-modal common information through a new bidirectional dynamic routing mechanism. The global multi-modal context is then used to guide the multi-modal dynamic routing process and to search for the globally optimal common cues of each modality; this greatly improves learning efficiency and provides a superior ability to build a bridge between all modalities.
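
The bidirectional dynamic routing mechanism itself is not detailed in this excerpt. For orientation only, the sketch below shows the standard routing-by-agreement procedure that capsule networks build on (Sabour et al., 2017), which the ABCN is described as extending with attention guidance from a global multimodal context; the tensor shapes and iteration count are illustrative assumptions.

```python
# Minimal sketch of standard capsule routing-by-agreement, given as a
# reference point; it is NOT the patent's bidirectional routing mechanism.
import torch


def squash(s, dim=-1):
    # Non-linearity that keeps each capsule vector's length in (0, 1).
    norm_sq = (s ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * s / torch.sqrt(norm_sq + 1e-8)


def dynamic_routing(u_hat, iterations=3):
    """u_hat: prediction vectors, shape (batch, n_in, n_out, d_out)."""
    b = torch.zeros(u_hat.shape[:3], device=u_hat.device)      # routing logits
    for _ in range(iterations):
        c = torch.softmax(b, dim=2)                             # coupling coefficients
        s = (c.unsqueeze(-1) * u_hat).sum(dim=1)                # weighted sum over inputs
        v = squash(s)                                           # output capsules (batch, n_out, d_out)
        b = b + (u_hat * v.unsqueeze(1)).sum(dim=-1)            # agreement update
    return v


# Toy usage: route 6 input capsules to 3 output capsules of dimension 16.
v = dynamic_routing(torch.randn(2, 6, 3, 16))
print(v.shape)  # torch.Size([2, 3, 16])
```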

Description

Technical field

[0001] The invention belongs to the field of multi-modal emotion recognition within natural language processing, vision, and speech, and relates to a multi-modal emotion classification method based on an attention-guided bidirectional capsule network. Specifically, it is an attention-guided two-way capsule network technology that combines capsule-network learning with an attention mechanism, i.e., a method that bidirectionally explores, decouples, and fuses multi-modal information to judge the emotional state of a test subject.

Background technique

[0002] Multimodal learning has attracted increasing interest in artificial intelligence systems, and the linguistic, acoustic, and visual modalities are widely used in related research tasks. Intuitively, a single modality can analyze task-related information from a specific perspective, while the integration of multiple modalities helps us effectively reason about the complex and comprehensive meaning of multimodal informati...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06F16/906
CPC: G06F16/906; G06F18/2415; Y02D10/00
Inventor: 孔万增, 刘栋军, 唐佳佳, 金宣妤
Owner: HANGZHOU DIANZI UNIV