
Target classification and positioning method based on network supervision

A target classification and positioning method based on network supervision. It addresses the problem that network-supervised methods have not matched the performance of weakly supervised learning, and achieves the effects of avoiding network over-fitting, good positioning performance, and improved fine-classification performance.

Active Publication Date: 2020-01-21
UNIVERSITY OF CHINESE ACADEMY OF SCIENCES
Cites: 13 · Cited by: 17

AI Technical Summary

Problems solved by technology

[0007] The network supervised learning methods above all involve some degree of human intervention or auxiliary labeling, and their performance is still not comparable to that of weakly supervised learning. Network supervised learning therefore still faces many problems, with substantial room for improvement.

Method used


Examples


Embodiment 1

[0102] 1. Database and sample classification

[0103] Applying the present invention for network-supervised target classification and positioning requires no dataset at the application stage. However, once training of the classification and positioning network is complete, a stable test set is needed to verify the classification accuracy of the classification network and the positioning accuracy of the positioning network. Among the existing datasets for weakly supervised classification and positioning tasks, the CUB_200_2011 dataset meets the needs of the experimental test set well.

[0104] As shown in Figure 8, the CUB_200_2011 dataset is an improved version of the CUB_200 dataset. It contains images of 200 bird species, 11788 images in total, of which 5794 form the test set, and can be used to evaluate fine-classification tasks; each test image has ...
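On a test set of this kind, classification and positioning networks are commonly scored with top-1 classification accuracy and localization accuracy (the predicted class must be correct and the predicted box must overlap the annotated box with IoU ≥ 0.5). A minimal sketch of that localization metric — the function names and the (x1, y1, x2, y2) box format are illustrative assumptions, not the patent's code:

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def localization_accuracy(preds, gts, iou_thresh=0.5):
    """preds/gts: parallel lists of (class_label, box), one entry per
    test image. An image counts as correct only when the class matches
    AND the boxes overlap with IoU >= iou_thresh."""
    hits = sum(1 for (pc, pb), (gc, gb) in zip(preds, gts)
               if pc == gc and iou(pb, gb) >= iou_thresh)
    return hits / len(gts)
```
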



Abstract

The invention provides a target classification and positioning method based on network supervision. The method comprises the following steps: automatically obtaining a large amount of network image data from a search engine according to the category of the target to be tested; filtering out noise images to form a training sample set; preliminarily constructing a classification and positioning network; and inputting samples from the training sample set into the preliminarily constructed classification and positioning network to perform feature extraction, classifying the features, obtaining position information of the target object, and training the classification and positioning network. In this end-to-end fine classification and positioning method based on network supervision, massive, easily obtained network images are used as the training set, manual annotation is removed entirely, and only image-level labels are used; an efficient convolutional network is designed, and algorithms such as global average pooling and class activation maps are fused, so that the method's performance exceeds that of weakly supervised learning methods on fine classification and positioning tasks.
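The abstract combines global average pooling with class activation mapping (CAM) to recover position information from image-level labels: the classifier's fully-connected weights for a class are applied spatially to the last convolutional feature maps, and a box is taken around the activated region. A minimal NumPy sketch of that step — the function names and the threshold value are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def class_activation_map(feature_maps, fc_weights, class_idx):
    """Class activation mapping: weight the final conv feature maps by
    the classifier weights of the target class.

    feature_maps: (C, H, W) activations from the last conv layer
    fc_weights:   (num_classes, C) weights of the classifier layer
                  that follows global average pooling
    class_idx:    index of the class to localize
    """
    # Global average pooling reduces each channel to a scalar during
    # classification; CAM re-uses the same per-channel weights spatially.
    cam = np.tensordot(fc_weights[class_idx], feature_maps,
                       axes=([0], [0]))          # (H, W)
    cam -= cam.min()
    if cam.max() > 0:
        cam /= cam.max()                          # normalize to [0, 1]
    return cam

def bounding_box_from_cam(cam, threshold=0.2):
    """Threshold the CAM and take the tight box around activated pixels."""
    ys, xs = np.where(cam >= threshold)
    if len(xs) == 0:
        return None
    return (xs.min(), ys.min(), xs.max(), ys.max())  # (x1, y1, x2, y2)
```
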

Description

Technical field

[0001] The present invention relates to the field of computer vision and image processing, and in particular to a method of object classification and positioning based on network supervision that can be used for intelligent automatic identification and related directions. The method can be widely applied to automatic identification of mobile phone photographs.

Background technique

[0002] Target positioning and detection tasks under fully supervised and weakly supervised learning have developed rapidly in recent years, with state-of-the-art performance constantly being refreshed. How can performance be improved further? Designing deeper networks and using more training data are two directions researchers are exploring. In practice, designing a deeper network inevitably requires more data, so how to provide more data to the network is a key issue to be studied. [0003] With a large amount of online visual data, the Internet and social media have become th...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62 · G06N3/04 · G06N3/08
CPC: G06N3/08 · G06N3/084 · G06N3/045 · G06F18/2415 · G06F18/214
Inventors: 叶齐祥 (Ye Qixiang), 付梦莹 (Fu Mengying), 万方 (Wan Fang), 韩振军 (Han Zhenjun), 焦建彬 (Jiao Jianbin)
Owner: UNIVERSITY OF CHINESE ACADEMY OF SCIENCES