
Local information and global information fusion-based target classification identification method

A technique that fuses local information and global information, applied in the field of target recognition. It addresses the problems of high classification cost and low classification accuracy, and achieves the effect of improving classification accuracy and making the classification process more intelligent.

Active Publication Date: 2017-10-20
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide a target classification and recognition method that fuses local information and global information, so as to solve the problems of low classification accuracy and high classification cost in the prior art.



Examples


Embodiment

[0053] Embodiment: category recognition on real data

[0054] The proposed idea can be applied to many classification tasks, and the specific clustering algorithm and base classifiers can be chosen according to actual requirements. The effect of the present invention was tested using real data sets as test samples. In this experiment, the base classifiers were a naive Bayes classifier, an SVM classifier, and an ENN classifier. The invention corrects the output of each base classifier to obtain better classification results, improving classification accuracy while reducing classification complexity.
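A minimal sketch of the three base classifiers named above, assuming the scikit-learn implementations of naive Bayes and SVM; an ENN classifier is not available in scikit-learn, so a k-nearest-neighbour classifier is used below purely as a placeholder for it. The classifier choices and parameters are illustrative, not the patent's configuration.

```python
# Illustrative base classifiers (the ENN classifier is stood in for by k-NN).
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

base_classifiers = {
    "naive_bayes": GaussianNB(),
    "svm": SVC(probability=True),               # probability=True so outputs can be corrected
    "enn_placeholder": KNeighborsClassifier(n_neighbors=5),
}
```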

[0055] Several real data sets were obtained from the UCI database as test samples, and the performance of the present invention was tested on the classification outputs of the three base classifiers. The basic information of the data sets is shown in Table 1:
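A sketch of the evaluation protocol described above, assuming the UCI Iris data set as a stand-in (the specific data sets used in the experiment are those listed in Table 1, which is not reproduced here) and a simple train/test split; `base_classifiers` is the dictionary from the previous sketch.

```python
# Evaluate each base classifier on one illustrative UCI data set.
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in base_classifiers.items():
    clf.fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: accuracy = {acc:.3f}")
```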

[0056][0057] Table 1: basic information of the UCI data sets (table content not included in this extract).



Abstract

The invention discloses a target classification and identification method based on the fusion of local information and global information. The method comprises the steps of: extracting local information from a training sample set by means of clustering to obtain K clusters of data, and calculating the cluster center corresponding to each cluster; obtaining an initial classification result for each sample in each cluster, correcting the initial classification result, and calculating the deviation between the corrected classification result and the true value; calculating a distance weight factor between the initial classification result of each sample in each cluster and the true value; and, for a target sample, calculating the distance between the target sample and each cluster center so as to adaptively select the corresponding effective correction matrix for correcting the initial classification result of the target sample, thereby obtaining the final classification result of the target. The clustering idea is applied to improving the accuracy of data classification; through the classification correction method that fuses local information and global information, the output of a classifier is corrected so that its identification output is closer to the true value, which effectively improves the classification precision of target recognition.
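A minimal sketch of the pipeline the abstract describes, assuming K-means for the clustering step, a Gaussian naive Bayes base classifier for the global step, and a per-cluster row-normalised confusion matrix as the correction matrix; the patent's actual correction, deviation, and distance-weight formulas are not reproduced on this page, so these details are illustrative only.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.naive_bayes import GaussianNB


def fit_corrected_classifier(X_train, y_train, n_clusters=3):
    classes = np.unique(y_train)
    n_classes = len(classes)

    # Global information: one base classifier trained on the whole training set.
    base = GaussianNB().fit(X_train, y_train)

    # Local information: cluster the training set and keep the cluster centers.
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X_train)

    # For each cluster, estimate a correction matrix mapping the base
    # classifier's predicted class toward the true class (here a
    # row-normalised confusion matrix, predicted -> true, as a stand-in).
    proba = base.predict_proba(X_train)
    corrections = []
    for k in range(n_clusters):
        idx = km.labels_ == k
        C = np.zeros((n_classes, n_classes))
        for p, y in zip(proba[idx], y_train[idx]):
            C[np.argmax(p), np.searchsorted(classes, y)] += 1.0
        row_sums = C.sum(axis=1, keepdims=True)
        C = np.divide(C, row_sums,
                      out=np.full_like(C, 1.0 / n_classes),
                      where=row_sums > 0)
        corrections.append(C)

    return base, km, classes, corrections


def predict_corrected(X_test, base, km, classes, corrections):
    # For each target sample, select the correction matrix of the nearest
    # cluster center and apply it to the base classifier's output.
    proba = base.predict_proba(X_test)
    dists = km.transform(X_test)          # distances to every cluster center
    nearest = np.argmin(dists, axis=1)
    corrected = np.vstack([proba[i] @ corrections[nearest[i]]
                           for i in range(len(X_test))])
    return classes[np.argmax(corrected, axis=1)]
```

In this sketch a target sample is corrected with the matrix of its single nearest cluster; the abstract's adaptive, distance-weighted selection of the effective correction matrix would replace that hard assignment.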

Description

【Technical field】

[0001] The invention belongs to the technical field of target recognition, and in particular relates to a target classification and recognition method in which local information and global information are fused.

【Background technique】

[0002] Multi-sensor automatic target recognition in an adversarial environment plays a very important role in the military field. With the rapid development of techniques such as non-cooperative target deception and jamming, and the continued use of multiple heterogeneous sensors, uncertainty and high conflict in the observation information have become increasingly prominent. In this context, how to improve the accuracy of target recognition has become an increasingly important issue.

[0003] At present, the solutions to the target recognition problem with uncertain observation data mainly include multi-classifier fusion and the construction of new classifiers. Multi-classifier fusion techn...

Claims


Application Information

IPC(8): G06K9/62
CPC: G06F18/24
Inventor: 刘准钆, 周平, 刘永超, 潘泉
Owner: NORTHWESTERN POLYTECHNICAL UNIV