
Cross-domain sentiment classification method based on neural structural correspondence learning with improved feature selection

A feature selection and sentiment classification technology, applied to special data processing applications, instruments, electrical digital data processing, etc. It addresses problems such as feature redundancy, failure to consider the importance of pivot features to the text, and insufficiently reasonable pivot features that affect the feature transfer results, and achieves the effects of reducing noise interference, improving accuracy, and reducing inter-domain differences.

Active Publication Date: 2019-11-22
KUNMING UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

However, traditional methods screen pivot features using only each feature's mutual information (MI) value. This considers only the degree of association between a pivot feature and the classification label, and ignores the feature's importance to the text and the redundancy among features, so the selected pivot features are not sufficiently reasonable, which affects the feature transfer results.
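
For comparison, the sketch below illustrates the MI-only pivot screening criticised here, using scikit-learn's mutual_info_classif; the toy corpus, labels, and cut-off k are hypothetical and not taken from the patent.

```python
# Baseline pivot screening by mutual information (MI) only, as criticised
# above: it ranks features by MI with the class label and ignores both a
# feature's importance to the text and redundancy among features.
# The toy corpus, labels, and k are hypothetical.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import mutual_info_classif

docs = ["great book, loved the story", "terrible plot, boring read",
        "excellent dvd, great quality", "awful dvd, waste of money"]
labels = np.array([1, 0, 1, 0])  # 1 = positive, 0 = negative

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)

mi = mutual_info_classif(X, labels, discrete_features=True)
k = 3  # number of pivot candidates to keep
pivot_idx = np.argsort(mi)[::-1][:k]
print(vectorizer.get_feature_names_out()[pivot_idx])
```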



Examples


Embodiment 1

[0035] Embodiment 1: As shown in Figures 1-5, the specific steps of the cross-domain sentiment classification method based on neural structural correspondence learning with improved feature selection are as follows:

[0036] Step 1: Use the Amazon product review dataset and select two different domains as the source domain and the target domain. Perform text preprocessing on the small number of labeled samples from the source domain D_s and the large number of unlabeled samples from the source domain D_s and the target domain D_t, to remove useless information and reduce noise interference. Use the parse-tree class ElementTree from the xml.etree toolkit to extract the review sentences between the tags of the web-tagged corpus;
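
A minimal sketch of this extraction step, assuming the corpus is a well-formed XML file and that the review sentences sit under a tag named review_text (the actual file and tag names are not given in the text):

```python
# Sketch of the corpus extraction in Step 1 with xml.etree.ElementTree.
# The file paths and the tag name "review_text" are assumptions; the patent
# only states that review sentences sit between tags of the web-tagged corpus.
import xml.etree.ElementTree as ET

def extract_reviews(xml_path, text_tag="review_text"):
    """Return the review sentences found under `text_tag` elements."""
    tree = ET.parse(xml_path)      # build the parse tree for the corpus file
    root = tree.getroot()
    return [elem.text.strip()
            for elem in root.iter(text_tag)
            if elem.text and elem.text.strip()]

# Hypothetical usage for a source-domain (books) and a target-domain (dvd) file:
# source_reviews = extract_reviews("books.unlabeled.xml")
# target_reviews = extract_reviews("dvd.unlabeled.xml")
```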

[0037] Step 2: Perform lemmatization on the text, eliminate redundant features, and vectorize the text to obtain the initial text features; then use the chi-square test feature selection method to filter out pivot ...
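
The following sketch shows one way Step 2 could be realised with NLTK lemmatization and scikit-learn's chi-square scorer; the lemmatizer, toy corpus, and the cut-off k are illustrative assumptions, not the patent's exact configuration:

```python
# Sketch of Step 2: lemmatize, vectorize, then keep the top-k features by
# chi-square score as pivot features; the remaining features are non-pivots.
# The NLTK lemmatizer (requires the 'wordnet' data), the toy corpus, and k
# are illustrative assumptions.
import numpy as np
from nltk.stem import WordNetLemmatizer
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import chi2

lemmatizer = WordNetLemmatizer()

def lemmatize(doc):
    # lower-case and reduce each token to its lemma (word-form restoration)
    return " ".join(lemmatizer.lemmatize(tok) for tok in doc.lower().split())

docs = ["the books were great", "worst movies ever made",
        "these dvds looked wonderful", "boring stories and bad writing"]
labels = np.array([1, 0, 1, 0])

vectorizer = CountVectorizer(preprocessor=lemmatize)
X = vectorizer.fit_transform(docs)          # initial text features

chi2_scores, _ = chi2(X, labels)            # association with the label
k = 5
order = np.argsort(chi2_scores)[::-1]
pivot_idx, nonpivot_idx = order[:k], order[k:]
print("pivot features:", vectorizer.get_feature_names_out()[pivot_idx])
```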

Embodiment 2

[0052] Embodiment 2: As shown in Figures 1-5, the specific steps of the cross-domain sentiment classification method based on neural structural correspondence learning with improved feature selection are as follows:

[0053] Step 1: Use the Amazon product review dataset; its statistics are shown in Table 1. Select two different domains as the source domain D_s and the target domain D_t. Since the dataset is web-tagged data, use the parse-tree class ElementTree from the xml.etree toolkit to extract the review sentences between the tags of the web-tagged corpus, obtaining the text content of the source domain and the target domain. The small number of labeled samples from the source domain D_s and the large number of unlabeled samples from the source domain D_s and the target domain D_t are then processed with stop-word removal to reduce noise interference (see the sketch after Table 1).

[0054] Table 1 Amazon product review statistics table

[0055] data set posi...
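
As a sketch of the stop-word removal described in Step 1 above, assuming the NLTK English stop-word list (the patent does not specify which list it uses):

```python
# Sketch of the stop-word removal in Embodiment 2, Step 1, assuming the NLTK
# English stop-word list (requires the 'stopwords' data to be downloaded).
from nltk.corpus import stopwords

STOP_WORDS = set(stopwords.words("english"))

def remove_stopwords(sentence):
    """Drop stop words to reduce noise before feature extraction."""
    return " ".join(tok for tok in sentence.lower().split()
                    if tok not in STOP_WORDS)

# Hypothetical usage on the extracted review sentences:
# source_labeled   = [remove_stopwords(s) for s in source_labeled_raw]
# source_unlabeled = [remove_stopwords(s) for s in source_unlabeled_raw]
# target_unlabeled = [remove_stopwords(s) for s in target_unlabeled_raw]
```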



Abstract

The invention relates to a cross-domain sentiment classification method based on neural structural correspondence learning with improved feature selection, and belongs to the field of natural language processing. The method comprises the following steps: first, selecting two different domains in the Amazon review dataset as the source domain and the target domain, and preprocessing the source-domain and target-domain data to obtain the text content of the source domain and the target domain; second, performing lemmatization on the text, eliminating redundant features, and vectorizing the text to obtain the initial text features; screening out pivot features with the chi-square test feature selection method to serve as the pivot features in the cross-domain task, the remaining features being non-pivot features; using the obtained pivot features, performing pivot-feature prediction on the non-pivot features of the two domains through neural structural correspondence learning to obtain the transferred features; and training a logistic classifier with the initial features and transferred features of the source-domain text, and testing with the text features and transferred features of the target domain to obtain the classification result of the target domain.
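
A hedged sketch of the pipeline the abstract describes is given below. A small MLP regressor stands in for the patent's neural structural correspondence learning component: it is trained to predict pivot-feature occurrences from non-pivot features on the unlabeled source and target data, its hidden layer supplies the transferred features, and a logistic classifier is trained on the concatenation of initial and transferred source-domain features. Layer size, optimiser defaults, dense feature matrices, and the binarisation of pivot counts are assumptions, not the patent's settings.

```python
# Hedged sketch of the pipeline in the abstract. An MLP regressor stands in
# for the neural structural correspondence learning component: it predicts
# pivot-feature occurrences from non-pivot features on unlabeled source and
# target data; its hidden layer provides the transferred features; a logistic
# classifier is trained on [initial source features, transferred features].
# Dense numpy feature matrices and all hyperparameters are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LogisticRegression

def transferred_features(X_nonpivot, mlp):
    """Hidden-layer (ReLU) activations of the pivot predictor."""
    hidden = X_nonpivot @ mlp.coefs_[0] + mlp.intercepts_[0]
    return np.maximum(hidden, 0.0)

def fit_pipeline(X_src, y_src, X_unlabeled, pivot_idx, nonpivot_idx):
    # 1) Pivot prediction: non-pivot features -> pivot occurrences,
    #    learned from unlabeled source + target documents.
    mlp = MLPRegressor(hidden_layer_sizes=(100,), activation="relu",
                       max_iter=300, random_state=0)
    mlp.fit(X_unlabeled[:, nonpivot_idx],
            (X_unlabeled[:, pivot_idx] > 0).astype(float))

    # 2) Train the logistic classifier on initial + transferred features.
    Z_src = transferred_features(X_src[:, nonpivot_idx], mlp)
    clf = LogisticRegression(max_iter=1000)
    clf.fit(np.hstack([X_src, Z_src]), y_src)
    return mlp, clf

def predict_target(X_tgt, nonpivot_idx, mlp, clf):
    # Test on target-domain text features plus their transferred features.
    Z_tgt = transferred_features(X_tgt[:, nonpivot_idx], mlp)
    return clf.predict(np.hstack([X_tgt, Z_tgt]))
```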

Description

Technical field

[0001] The invention relates to a cross-domain sentiment classification method based on neural structural correspondence learning with improved feature selection, and belongs to the technical field of natural language processing.

Background technique

[0002] A domain refers to a class of entities, and different domains are different types of entities; for example, BOOK and DVD can be regarded as different domains. Because product reviews have strong domain characteristics, the sentiment information expressed by the same word in two domains is not necessarily the same. In addition, a sentiment classifier trained on the labeled corpus of one domain suffers reduced classification accuracy in another domain. In feature-transfer domain adaptation work, the most typical method is Structural Correspondence Learning (SCL), which establishes the correlation between the source domain and the target domain through pivot features shared across domains. However, when the tradi...
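
For readers unfamiliar with SCL, the sketch below outlines the classical linear formulation referenced here: one predictor per pivot feature is fit from the non-pivot features, and a truncated SVD of the stacked predictor weights yields the shared cross-domain projection. Ridge regression and the dimension h are illustrative assumptions (the original SCL uses modified-Huber linear classifiers).

```python
# Minimal sketch of classical Structural Correspondence Learning (SCL):
# fit one linear predictor per pivot feature from the non-pivot features,
# stack the predictor weights, and take a truncated SVD; projecting the
# non-pivot features onto the top singular vectors gives the shared
# cross-domain representation. Ridge regression and h are assumptions.
import numpy as np

def scl_projection(X_nonpivot, pivot_occurrence, h=50, reg=1.0):
    """X_nonpivot: (n_docs, n_nonpivot); pivot_occurrence: (n_docs, n_pivots)."""
    d = X_nonpivot.shape[1]
    # Closed-form ridge regression solves all pivot predictors at once;
    # column j of W predicts pivot j from the non-pivot features.
    A = X_nonpivot.T @ X_nonpivot + reg * np.eye(d)
    W = np.linalg.solve(A, X_nonpivot.T @ pivot_occurrence)
    # Top-h left singular vectors of W define the correspondence mapping.
    U, _, _ = np.linalg.svd(W, full_matrices=False)
    return U[:, :h]                      # theta: (n_nonpivot, h)

# Shared features for any document matrix X_nonpivot: X_nonpivot @ theta
```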


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/27, G06K9/62
CPC: G06F18/24, G06F18/214
Inventor: 相艳, 梁俊葛, 余正涛, 线岩团, 熊馨, 许莹
Owner: KUNMING UNIV OF SCI & TECH