
A Structure-Extended Multinomial Naive Bayesian Text Classification Method

A text classification method based on the multinomial model, applied in the fields of unstructured text data retrieval, text database clustering/classification, and special data processing applications. It addresses the problem that no structure-extension method had been found for the multinomial Naive Bayesian text classification model, achieving the effect of avoiding the structure learning stage and saving space resources.

Inactive Publication Date: 2018-05-01
CHINA UNIV OF GEOSCIENCES (WUHAN)
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, due to the high dimensionality of text data itself, no structure-extension method for improving the multinomial Naive Bayesian text classification model has been found so far.




Embodiment Construction

[0040] The present invention is further described below in conjunction with an embodiment.

[0041] The present invention provides a structure-extended multinomial naive Bayesian text classification method comprising a training phase and a classification phase, wherein:

[0042] (1) The training phase includes the following processes:

[0043] (1-1) Use the following formula to calculate the prior probability p(c) of each category in the training document set D:

[0044] p(c) = ( 1 + Σ_{j=1}^{n} δ(c_j, c) ) / ( n + s )

[0045] Here, the training document set D is a known document set, and any document d in D is expressed in word-vector form d = <w_1, w_2, ..., w_m>, where w_i is the i-th word in document d and m is the number of words in the training document set D; n is the number of documents in D, s is the number of document categories, c_j is the class label of the j-th document, and δ(c_j, c) is a binary function that equals 1 when its two arguments are the same and 0 otherwise.
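The smoothed prior estimate above can be sketched in a few lines of Python; the function name and call shape are illustrative, not part of the patent:

```python
from collections import Counter

def prior_probabilities(class_labels, num_classes):
    """Estimate p(c) for each class with Laplace smoothing, matching
    the formula in [0044]: p(c) = (1 + sum_j delta(c_j, c)) / (n + s).
    """
    n = len(class_labels)            # n: number of training documents
    counts = Counter(class_labels)   # sum_j delta(c_j, c) for each class
    return {c: (1 + counts.get(c, 0)) / (n + num_classes)
            for c in range(num_classes)}

# Example: 5 documents, 2 classes
# class 0 appears 3 times -> (1 + 3) / (5 + 2) = 4/7
# class 1 appears 2 times -> (1 + 2) / (5 + 2) = 3/7
print(prior_probabilities([0, 0, 1, 0, 1], 2))
```

The `1` in the numerator and the `s` in the denominator are the Laplace-smoothing terms, which keep every prior strictly positive even for classes absent from the training set.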


Abstract

The invention provides a structure-extended multinomial naive Bayes text classification method. First, a one-dependence multinomial estimator is established using each word that occurs in a test document as a parent node; all one-dependence multinomial estimators are then weighted-averaged to predict the category of the test document, where each weight is the information gain ratio of the corresponding word. The method avoids the structure-learning phase of a Bayesian network, thereby reducing the time cost caused by the high dimensionality of text data; meanwhile, the estimation of the double conditional probabilities is deferred to the classification stage, saving considerable space. The method not only improves the classification accuracy of the multinomial naive Bayes text classifier, but also avoids the time and space costs of Bayesian-network structure learning.
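The prediction step described in the abstract can be sketched as follows. This is a minimal illustration, assuming the probability tables and gain ratios are already estimated; all names, the dictionary-based call signature, and the exact averaging arrangement are assumptions for illustration, not the patent's notation:

```python
def classify_semnb(doc_freqs, priors, cond, pair_cond, gain_ratio, classes):
    """Structure-extended MNB sketch: each word in the test document
    anchors a one-dependence multinomial estimator, and the estimators'
    scores are averaged with information-gain-ratio weights.

    doc_freqs : {word: frequency f_i in the test document}
    priors    : {class: p(c)}
    cond      : {(word, class): p(w|c)}
    pair_cond : {(word, parent, class): p(w | parent, c)}
    gain_ratio: {word: information gain ratio of the word}
    """
    def score(c):
        total, wsum = 0.0, 0.0
        for parent, f_t in doc_freqs.items():        # one estimator per parent word
            p = priors[c] * cond[(parent, c)] ** f_t
            for w, f_i in doc_freqs.items():
                if w != parent:                      # other words depend on the parent
                    p *= pair_cond[(w, parent, c)] ** f_i
            g = gain_ratio[parent]
            total += g * p
            wsum += g
        return total / wsum                          # gain-ratio-weighted average
    return max(classes, key=score)
```

Note that the double conditional probabilities p(w | parent, c) are only looked up here, at classification time, which reflects the abstract's point that their estimation is deferred to the classification stage rather than stored for all word pairs in advance. A production version would work in log space to avoid floating-point underflow on long documents.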

Description

technical field

[0001] The invention relates to a structure-extended multinomial naive Bayesian text classification method, belonging to the technical field of artificial intelligence and data mining classification.

Background technique

[0002] The naive Bayesian text classifier is often used for text classification problems because of its simplicity and efficiency, but its attribute-independence assumption, while making it efficient, affects its classification performance to some extent. Given a document d represented as a word vector <w_1, w_2, ..., w_m>, multinomial Naive Bayes (MNB) classifies d using the following formula:

[0003] c(d) = argmax_{c ∈ C} p(c) · Π_{i=1}^{m} p(w_i | c)^{f_i}

[0004] In the above formula, C is the set of class labels, m is the number of words, w_i (i = 1, 2, ..., m) is the i-th word appearing in document d, f_i is the frequency of word w_i in document d, the prior probability p(c) is estimated by formula 1, and the conditional ...
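The baseline MNB decision rule above can be sketched directly; the helper names are illustrative, and the computation is done in log space (a standard equivalent of the product form) to avoid floating-point underflow:

```python
import math

def classify_mnb(doc_freqs, priors, cond, classes):
    """Multinomial Naive Bayes per the formula above:
    c(d) = argmax_c p(c) * prod_i p(w_i|c)^{f_i}.

    doc_freqs : {word: frequency f_i in document d}
    priors    : {class: p(c)}
    cond      : {(word, class): p(w|c)}
    """
    def log_score(c):
        s = math.log(priors[c])
        for w, f in doc_freqs.items():
            s += f * math.log(cond[(w, c)])  # f_i * log p(w_i|c)
        return s
    return max(classes, key=log_score)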

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F17/30
CPC: G06F16/35
Inventors: 蒋良孝 (Jiang Liangxiao), 王沙沙 (Wang Shasha), 李超群 (Li Chaoqun), 张伦干 (Zhang Lungan)
Owner: CHINA UNIV OF GEOSCIENCES (WUHAN)