
Hierarchical support vector machine classifying method based on rejection subspace

A hierarchical support vector machine classification technology, addressing the problems of unbalanced numbers of class samples, classifiers that cannot fully exploit the effective information in the data, and the large size of the original data set.

Active Publication Date: 2013-12-25
INST OF AUTOMATION CHINESE ACAD OF SCI
Cites 3 · Cited by 14

AI Technical Summary

Problems solved by technology

[0012] Undersampling the negative class can rebalance the data and shrink the original data set, but it destroys the internal structure of the data, so the classifier cannot make full use of the effective information in the original data set.
Oversampling the positive class can also rebalance the data, but it enlarges the original data set, further increasing the computational complexity of training the classifier.
Cost-sensitive learning is an effective remedy for the imbalance in class sample counts, and some scholars have shown that it should be preferred when the misclassification cost of each class is known. In practical classification problems, however, these misclassification costs are often unknown.
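The cost-sensitive alternative discussed above can be illustrated with a minimal sketch (not from the patent): scikit-learn's `SVC` accepts per-class misclassification weights, and `class_weight="balanced"` reweights errors inversely to class frequency, which is one way to encode an assumed higher cost for misclassifying minority-class samples. The toy data and class means here are illustrative assumptions.

```python
# Hedged sketch: cost-sensitive SVM training via per-class error weights.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)
# Imbalanced toy data: 20 positive (minority) vs 200 negative samples.
X_pos = rng.normal(loc=0.7, scale=1.0, size=(20, 2))
X_neg = rng.normal(loc=-0.7, scale=1.0, size=(200, 2))
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 20 + [0] * 200)

# Unweighted SVM vs. SVM with class weights inversely proportional to
# class frequency (a stand-in for unknown misclassification costs).
plain = SVC(kernel="linear").fit(X, y)
weighted = SVC(kernel="linear", class_weight="balanced").fit(X, y)

print("minority recall, unweighted:", recall_score(y, plain.predict(X)))
print("minority recall, weighted:  ", recall_score(y, weighted.predict(X)))
```

In practice the weights would be set from known costs when available; `"balanced"` is merely a frequency-based default.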




Embodiment Construction

[0033] Various details of the technical solution of the present invention are described below in conjunction with the accompanying drawings. It should be pointed out that the described embodiments are intended only to facilitate understanding of the present invention, not to limit it in any way.

[0034] 1. Method overview

[0035] Figure 1 shows a flow chart of the hierarchical support vector machine classification method based on the rejection subspace. Because a multi-class classification problem decomposes into a series of two-class problems, and the one-versus-rest training criterion admits parallel processing, only the two-class case is discussed here. The main steps of the method are as follows:

[0036] Step S1: Divide the data set proportionally into a training data set and a validation data set; determine the total...
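The hierarchical idea described above can be sketched as a two-layer cascade. This is an illustrative reading, not the patent's exact algorithm: a cheap linear SVM classifies confident samples, and samples whose decision score falls inside a rejection band are passed to a costlier RBF SVM trained only on that band. The fixed band half-width `t` here is an assumption; the patent selects the rejection subspace with a mutual-information criterion instead.

```python
# Hedged sketch of a two-layer SVM cascade with a rejection band.
import numpy as np
from sklearn.svm import SVC, LinearSVC

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 2))
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 0).astype(int)  # nonlinear boundary

fast = LinearSVC().fit(X, y)            # layer 1: low computational complexity
scores = fast.decision_function(X)
t = 0.5                                  # assumed rejection half-width
rejected = np.abs(scores) < t            # the "rejection subspace"

# Layer 2: a higher-accuracy SVM trained only on the rejected samples.
slow = SVC(kernel="rbf").fit(X[rejected], y[rejected])

pred = (scores > 0).astype(int)
pred[rejected] = slow.predict(X[rejected])   # re-judge rejected samples
print("cascade training accuracy:", (pred == y).mean())
```

The divide-and-conquer benefit is that the second layer trains on a much smaller set than the original, while confident samples never pay the cost of the expensive kernel.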



Abstract

The invention relates to a hierarchical support vector machine classification method based on a rejection subspace, applicable to multi-class or unbalanced big-data classification problems. By means of the rejection subspace, the method processes big data hierarchically and in parallel so as to improve the classification result. The method comprises the following steps: firstly, training support vector machines of low computational complexity; secondly, determining the rejection subspaces of these support vector machines by a mutual-information learning criterion, thereby obtaining rejection training sets within the original training sets; and thirdly, training high-accuracy support vector machines on the rejection training sets to judge the rejected samples further; this training process is repeated as many times as actual requirements dictate. Following the idea of divide and conquer, the method reduces the training complexity of the support vector machine at each layer, and the optimal rejection subspace is determined from the data through mutual information, so the method has low computational complexity and lets the data speak for themselves. It can be applied to big-data classification fields such as medical diagnosis and multi-class object detection.
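The abstract's second step, choosing the rejection subspace with a mutual-information learning criterion, could be sketched as below. The three-valued output encoding (negative / reject / positive) and the threshold grid are illustrative assumptions; the patent's exact objective is not reproduced here.

```python
# Hedged sketch: select a rejection band half-width t by scoring each
# candidate with the mutual information between the true labels and a
# three-valued output (-1 = negative, 0 = reject, +1 = positive).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 2))
# Noisy labels so that samples near the boundary are genuinely ambiguous.
y = (X[:, 0] + 0.3 * rng.normal(size=300) > 0).astype(int)

scores = LinearSVC().fit(X, y).decision_function(X)

def three_way(scores, t):
    """Map decision scores to -1 (negative), 0 (reject), +1 (positive)."""
    out = np.sign(scores).astype(int)
    out[np.abs(scores) < t] = 0
    return out

grid = np.linspace(0.05, 1.5, 30)
best_t = max(grid, key=lambda t: mutual_info_score(y, three_way(scores, t)))
print("selected rejection half-width:", best_t)
```

In the patent's scheme this selection would be driven by the data on each layer, which is what "letting the data speak" refers to in the abstract.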

Description

technical field

[0001] The invention belongs to the technical field of pattern recognition and machine learning, and relates to a hierarchical support vector machine classification method.

Background technique

[0002] Two common problems in unbalanced big-data classification fields such as medical diagnosis and intrusion detection are excessive computational complexity and unbalanced numbers of class samples. The huge amount of data greatly increases the computational complexity of training classifiers. In addition, the imbalance in class sample counts makes it easy for a classifier to judge positive samples as negative (in this patent, positive samples denote the minority class and negative samples the majority class). In general, the cost of judging a positive sample as negative is significantly higher than the cost of judging a negative sample as positive, su...

Claims


Application Information

IPC(8): G06K9/62
Inventor 徐贵标 (XU Guibiao), 胡包钢 (HU Baogang)
Owner INST OF AUTOMATION CHINESE ACAD OF SCI