
Text classification method based on generative multi-task learning model

A text classification method based on a generative multi-task learning model. It addresses the problem that, because existing classification models lack semantic association with one another, their classification effect is difficult to optimize comprehensively; the method achieves improved classification performance and enhanced semantic association between sub-tasks.

Active Publication Date: 2019-10-18
湖南数定智能科技有限公司
Cites 13 references; cited by 40.

AI Technical Summary

Problems solved by technology

[0004] The present invention proposes a text classification method based on a generative multi-task learning model, which overcomes a defect of prior-art classification models: because the various models lack semantic association with one another, their classification effect is difficult to optimize comprehensively. The multi-label classification task and the hierarchical classification task are integrated into a single multi-task classification model, with the aim of improving the classification performance of each sub-task.

Method used




Detailed Description of Embodiments

[0014] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0015] An embodiment of the present invention provides a text classification method based on a generative multi-task learning model.

[0016] As shown in Figure 1, the framework of the present invention includes a data input and preprocessing module, a word embedding module, and an encoding module. The multi-label classification task and the hierarchical classification task each have an independent attention mechanism, and include indepe...
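The framework described above, a shared encoder feeding two task-specific attention mechanisms, can be sketched roughly as follows. This is a minimal illustrative sketch, not the patented implementation: the class name, the per-task query vectors, and all dimensions are assumptions introduced here for clarity.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class SharedEncoderMultiTask:
    """Toy sketch: shared encoder states feed two independent,
    task-specific attention mechanisms (multi-label vs. hierarchical).
    All parameter names and shapes are illustrative assumptions."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        # one hypothetical attention query vector per sub-task
        self.q_multilabel = rng.normal(size=dim)
        self.q_hierarchical = rng.normal(size=dim)

    def attend(self, H, q):
        # H: (seq_len, dim) encoder hidden states; q: (dim,) task query
        weights = softmax(H @ q)   # attention distribution over tokens
        return weights @ H         # task-specific context vector

    def forward(self, H):
        # each sub-task attends over the same shared encoding
        return (self.attend(H, self.q_multilabel),
                self.attend(H, self.q_hierarchical))
```

The key design point, consistent with the text, is that the encoder is shared while each sub-task keeps its own attention parameters, so the tasks read different summaries of the same representation.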



Abstract

The invention discloses a text classification method based on a generative multi-task learning model. The method comprises the steps of: alternately decoding a multi-task classification model over a training set according to a multi-label classification task and a hierarchical classification task; obtaining, through decoding, a current-moment semantic feature related to the encoding-moment feature vector, and training on that semantic feature; optimizing the multi-task classification model according to the training result and the real label set to obtain an optimized multi-task classification model; and inputting data from the set to be classified into the optimized model to obtain the classification labels for each task, completing the classification. During training, semantic association between the multi-label classification model and the hierarchical classification model is constructed through the multi-task mechanism, so that the semantic association between the prediction results of the two sub-tasks is enhanced and the classification performance of each sub-task is improved.
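The alternating-task training schedule the abstract describes can be sketched as follows. This is a hedged sketch only: `model_step` is an assumed callback (not named in the patent) that decodes one batch for one task and returns its loss, and the even/odd alternation rule is an illustrative choice.

```python
def train_alternating(model_step, batches, epochs=2):
    """Sketch of alternating multi-task training: each batch is
    decoded for one sub-task at a time, switching between the
    multi-label and hierarchical objectives.

    model_step(batch, task) -> loss is a hypothetical callback;
    a real implementation would also run an optimizer update here.
    """
    tasks = ("multi_label", "hierarchical")
    history = []
    for _ in range(epochs):
        for i, batch in enumerate(batches):
            task = tasks[i % 2]          # alternate the two sub-tasks
            loss = model_step(batch, task)
            history.append((task, loss))
    return history
```

Alternating the objectives over a shared model is one standard way to let the two sub-tasks shape a common representation, which matches the stated goal of building semantic association between them.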

Description

Technical Field

[0001] The invention belongs to the technical field of natural language processing text classification, and in particular relates to a text classification method based on a generative multi-task learning model.

Background

[0002] Text multi-label classification and hierarchical classification are problems that frequently arise in practical applications; they are two important branches of text classification and current research hotspots. In practice, many data instances are ambiguous, and a single instance may correspond to multiple category labels in the label set. The purpose of text multi-label classification is to establish a one-to-many association between a text and the label set. Compared with traditional single-label classification, multi-label classification better matches real application scenarios. For example, in the public security business scenario, the process of case acceptance will generate a large...
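The one-to-many text-to-label association described above is commonly represented as a multi-hot vector over a fixed label vocabulary. A minimal sketch, with an entirely hypothetical label set for illustration:

```python
def multi_hot(labels, label_set):
    """Encode a set of labels for one instance as a multi-hot vector
    over a fixed label vocabulary (one-to-many association)."""
    return [1 if label in labels else 0 for label in label_set]

# illustrative example: a case description tagged with two labels at once
label_set = ["theft", "fraud", "assault"]
vector = multi_hot({"theft", "fraud"}, label_set)  # → [1, 1, 0]
```

In contrast, single-label classification would force exactly one `1` in this vector; multi-label classification allows any subset, which is what makes it suit ambiguous real-world instances.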

Claims


Application Information

IPC(8): G06F16/35
CPC: G06F16/35
Inventors: 谢松县, 高辉, 陈仲生, 彭立宏
Owner: 湖南数定智能科技有限公司