
A text classification method based on a generative multi-task learning model

A multi-task learning and text classification technology, applied in the field of text classification based on a generative multi-task learning model. It solves problems such as classification effects that are difficult to optimize comprehensively, and achieves the effects of improving classification performance and enhancing semantic association.

Active Publication Date: 2021-07-16
湖南数定智能科技有限公司

AI Technical Summary

Problems solved by technology

[0004] The present invention proposes a text classification method based on a generative multi-task learning model, which is used to overcome the defect in the prior art that, because the various classification models lack semantic association, the classification effect is difficult to optimize comprehensively. The multi-label classification task and the hierarchical classification task are integrated into a single multi-task classification model so as to improve the classification performance of each sub-task.

Detailed Description of the Embodiments

[0014] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0015] An embodiment of the present invention provides a text classification method based on a generative multi-task learning model.

[0016] As shown in Figure 1, the framework of the present invention includes a data input and preprocessing module, a word embedding module, and an encoding module. The multi-label classification task and the hierarchical classification task each have an independent attention mechanism and include indepe...
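Since the paragraph above only names the modules, the following is a minimal PyTorch-style sketch of how such a framework could be wired together: a shared embedding and encoding backbone with an independent attention mechanism and output layer per task. The layer choices (an LSTM encoder, additive attention, plain linear heads in place of the patent's generative decoders) and all module names and sizes are illustrative assumptions, not details taken from the patent.

    # Sketch only: shared backbone with per-task attention; details are assumed.
    import torch
    import torch.nn as nn

    class TaskAttention(nn.Module):
        """Additive attention that pools the encoder states for one task."""
        def __init__(self, hidden_size):
            super().__init__()
            self.score = nn.Linear(hidden_size, 1)

        def forward(self, encoder_states):               # (batch, seq_len, hidden)
            weights = torch.softmax(self.score(encoder_states), dim=1)
            return (weights * encoder_states).sum(dim=1)  # (batch, hidden)

    class MultiTaskClassifier(nn.Module):
        def __init__(self, vocab_size, embed_dim, hidden_size,
                     num_flat_labels, num_hier_labels):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim)       # word embedding module
            self.encoder = nn.LSTM(embed_dim, hidden_size,
                                   batch_first=True, bidirectional=True)  # encoding module
            enc_dim = 2 * hidden_size
            # Independent attention mechanisms for the two sub-tasks.
            self.flat_attention = TaskAttention(enc_dim)
            self.hier_attention = TaskAttention(enc_dim)
            self.flat_head = nn.Linear(enc_dim, num_flat_labels)       # multi-label task
            self.hier_head = nn.Linear(enc_dim, num_hier_labels)       # hierarchical task

        def forward(self, token_ids):
            states, _ = self.encoder(self.embedding(token_ids))
            return (self.flat_head(self.flat_attention(states)),
                    self.hier_head(self.hier_attention(states)))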



Abstract

The present invention constructs a text classification method based on a generative multi-task learning model. On the training set, the multi-task classification model is decoded alternately for the multi-label classification task and the hierarchical classification task, obtaining the semantic features at the current time step and training the model; the multi-task classification model is then optimized according to the training results and the real label set to obtain the optimized multi-task classification model; finally, the data in the test set are input into the optimized multi-task classification model, and the classification labels of the two tasks are obtained respectively to complete the classification. During training, the model establishes semantic correlation between the multi-label classification model and the hierarchical classification model through the multi-task mechanism, so the semantic correlation between the prediction results of the two sub-tasks is enhanced, thereby improving the classification performance of the sub-tasks.
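As a rough illustration of the alternating training scheme described in the abstract, the sketch below updates a shared model for the multi-label task and the hierarchical task in turn, with each loss computed against the real label set. It assumes a model that returns a pair of logits (such as the MultiTaskClassifier sketch above); the optimizer, the binary cross-entropy losses, and the batch format are assumptions, and the patent's actual decoding and optimization steps are not reproduced here.

    # Sketch only: alternate between the two sub-tasks within each batch.
    import torch
    import torch.nn as nn

    def train(model, loader, epochs=5, lr=1e-3):
        optimizer = torch.optim.Adam(model.parameters(), lr=lr)
        flat_loss_fn = nn.BCEWithLogitsLoss()   # multi-hot multi-label targets
        hier_loss_fn = nn.BCEWithLogitsLoss()   # multi-hot hierarchical-path targets
        for _ in range(epochs):
            for token_ids, flat_targets, hier_targets in loader:
                for task in ("multi_label", "hierarchical"):   # alternate decoding
                    optimizer.zero_grad()
                    flat_logits, hier_logits = model(token_ids)
                    if task == "multi_label":
                        loss = flat_loss_fn(flat_logits, flat_targets)
                    else:
                        loss = hier_loss_fn(hier_logits, hier_targets)
                    loss.backward()
                    optimizer.step()
        return model

At test time the same model would be run on the test-set texts and the two sets of logits thresholded (for example, sigmoid outputs above 0.5) to obtain the multi-label and hierarchical predictions respectively.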

Description

Technical Field

[0001] The invention belongs to the technical field of natural language processing and text classification, and in particular relates to a text classification method based on a generative multi-task learning model.

Background Technique

[0002] Text multi-label classification and hierarchical classification are problems that frequently need to be solved in practical applications; they are also two important branches of text classification and current research hotspots. In practical applications, many data instances are ambiguous, and a single instance may correspond to multiple category labels in the label set. The purpose of text multi-label classification is to establish a one-to-many association between a text and the label set. Compared with traditional single-label classification, multi-label classification better matches real application scenarios. For example, in the public security business scenario, the process of case acceptance will generate a large...
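To make the one-to-many association and the hierarchical labels concrete, here is a purely illustrative example; the text and all label names are invented for this sketch and are not taken from the patent.

    # Illustrative only: one instance with a flat multi-label set and a
    # hierarchical label path (coarse-to-fine categories).
    sample = {
        "text": "Report of a stolen electric bicycle near the central station.",
        "multi_labels": ["theft", "vehicle", "public_place"],   # one-to-many association
        "hierarchical_labels": ["crime", "crime/property", "crime/property/theft"],
    }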

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F16/35
CPC: G06F16/35
Inventors: 谢松县, 高辉, 陈仲生, 彭立宏
Owner: 湖南数定智能科技有限公司