
A Deep Learning Text Classification Method Integrating Shallow Semantic Representation Vectors

A deep learning and text classification technology, applied to neural learning methods, text database clustering/classification, semantic analysis, etc. It addresses the problems that accuracy and reliability fail to reach a practical level, that deep learning lacks prior knowledge, and that prior knowledge is difficult to use effectively, achieving the effect of effectively utilizing prior knowledge.

Active Publication Date: 2022-06-07
HUAQIAO UNIVERSITY

AI Technical Summary

Problems solved by technology

In general, however, the accuracy and reliability fall far short of a practical level, which is caused by deep learning's lack of prior knowledge.
Because big-data-driven deep learning models can only find statistical conclusions in the data set, it is difficult for them to make effective use of prior knowledge.




Embodiment Construction

[0028] The present invention is further described below in conjunction with specific embodiments. It should be understood that these examples are intended only to illustrate the present invention and not to limit its scope. In addition, after reading the teachings of the present invention, those skilled in the art may make various changes or modifications to it, and such equivalent forms likewise fall within the scope defined by the appended claims of this application.

[0029] As shown in figure 1 and figure 2, a deep learning text classification method integrating shallow semantic representation vectors according to the present invention includes the following steps: (1) constructing word embedding vectors; (2) constructing shallow semantic vectors; (3) constructing a CNN text classifier.
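The patent text does not spell out how steps (1) and (2) are implemented. The sketch below shows one plausible reading, assuming gensim's word2vec for the embeddings and a domain vocabulary dictionary that maps each lexicon word to a semantic category; the function names, dimensions, and the one-hot encoding of the shallow semantic vector are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch of steps (1)-(2), assuming a gensim word2vec model and a
# domain dictionary that maps each lexicon word to a semantic category.
import numpy as np
from gensim.models import Word2Vec

def train_word_embeddings(tokenized_corpus, dim=128):
    """Step (1): train word embedding vectors on the tokenized text corpus."""
    model = Word2Vec(sentences=tokenized_corpus, vector_size=dim,
                     window=5, min_count=1, workers=4)
    return model.wv  # keyed vectors: word -> embedding

def build_shallow_semantic_vector(word, domain_dict, categories):
    """Step (2): derive a shallow semantic vector from a domain lexicon.
    Here it is a one-hot vector over lexicon categories; words absent from
    the lexicon get an all-zero vector (an assumption of this sketch)."""
    vec = np.zeros(len(categories), dtype=np.float32)
    category = domain_dict.get(word)
    if category is not None:
        vec[categories.index(category)] = 1.0
    return vec
```

A one-hot encoding over lexicon categories is only one plausible interpretation of the "shallow semantic vector"; the patent leaves the exact encoding to the domain dictionary used.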

[0030] Taking emotion classification as an example, three emotion datasets are selected for experiments, in...



Abstract

The invention discloses a deep learning text classification method integrating shallow semantic representation vectors. The method first trains word embedding vectors on a text corpus; it then uses a domain vocabulary dictionary as a shallow semantic vocabulary to generate a shallow semantic vector representation for each word in the corpus. Next, the weighted splicing of the two word vectors is fed into a CNN model as a new word vector for feature extraction and model training, building a text classifier. The invention remedies the defect that big-data-driven word vector representations lack lexical features and knowledge representation and therefore struggle to capture the true semantic information of words, giving the model richer feature representations and higher classification performance.
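As a rough companion to the abstract, the sketch below shows one way the weighted splicing of the two vectors and the CNN classifier of step (3) could be realized in PyTorch. The weight alpha, the concatenation scheme, and all layer sizes are assumptions for illustration and are not specified in the patent.

```python
# Sketch of step (3): weighted concatenation of the embedding vector and the
# shallow semantic vector, feeding a simple 1D-CNN text classifier (PyTorch).
# Alpha and all layer sizes are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn

def fuse_vectors(embedding_vec, shallow_vec, alpha=0.7):
    """Weighted splicing: scale each part, then concatenate into one vector."""
    return np.concatenate([alpha * embedding_vec, (1.0 - alpha) * shallow_vec])

class CNNTextClassifier(nn.Module):
    def __init__(self, fused_dim, num_classes, num_filters=100,
                 kernel_sizes=(3, 4, 5)):
        super().__init__()
        self.convs = nn.ModuleList(
            [nn.Conv1d(fused_dim, num_filters, k) for k in kernel_sizes])
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, x):            # x: (batch, seq_len, fused_dim)
        x = x.transpose(1, 2)        # -> (batch, fused_dim, seq_len)
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))
```

The multi-width convolutions with max-over-time pooling follow the common TextCNN pattern; the patent only states that a CNN model is used, so this architecture is an assumption.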

Description

Technical field

[0001] The invention relates to the field of deep learning and text classification, and in particular to a deep learning text classification method integrating shallow semantic representation vectors.

Background technique

[0002] Text classification refers to the process of predicting the category of a large amount of unstructured text corpus according to a given classification system. With breakthroughs in deep learning technology, word embedding techniques represented by word2vec and deep learning models represented by convolutional neural networks have achieved good results in text classification. Overall, however, the accuracy and reliability remain far from a practical level, which is caused by deep learning's lack of prior knowledge. Because big-data-driven deep learning models can only find statistical conclusions in the data set, it is difficult for them to effectively utilize prior knowledge. Integrating prior knowledge into deep learning models is an id...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F16/35, G06F40/30, G06F40/289, G06F40/242, G06N3/04, G06N3/08
CPC: G06F16/35, G06N3/08, G06F40/242, G06F40/289, G06F40/30, G06N3/045
Inventor: 王华珍, 李小整, 何霆, 贺惠新, 李弼程
Owner: HUAQIAO UNIVERSITY