
Few-shot classification method based on an attention model

A few-shot classification method based on an attention model, applied in the field of attention-model-based few-shot classification. It addresses the problems that existing methods fail to discover the importance information within same-class images, seldom consider differences in the amount of information carried by individual sample images, and consequently perform poorly; it achieves fast computation, low computing-resource usage, and a simple computational model.

Pending Publication Date: 2019-11-12
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

[0004] Although few-shot image recognition has made some progress in recent years as the number of researchers has grown, most work focuses on learning or designing a metric, so many methods show minimal innovation. These works pay little attention to the fact that sample images differ in the amount of information they contain, and the important information within same-class images has not been exploited; thus, despite some progress, the results remain unsatisfactory.



Embodiment Construction

[0022] The attention-model-based few-shot classification method of the present invention is described in detail below in conjunction with the embodiments.

[0023] The attention-model-based few-shot classification method of the present invention aims to propose an attention model that attends to the more important samples, assigning larger weights to more important sample images and lower weights to unimportant ones, so that classification is dominated by the important samples. The method specifically includes the following steps:

[0024] 1) Train a convolutional neural network image classification model. After training, remove its fully connected layer and retain its convolutional network part; the convolutional neural network image classification model i...
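Step 1 above (truncating a trained classifier and keeping only its feature-extracting part) can be sketched as follows. This is a minimal illustration, not the patent's implementation: a fixed random projection stands in for the convolutional part, and the sizes (64-dimensional input, M = 16 features, 5 output classes) are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a trained image classifier: feature extractor -> FC head.
# A real system would use a trained CNN; here a random projection plays the
# role of the convolutional part (assumed sizes: 64-d input, M = 16 features).
W_conv = rng.standard_normal((64, 16))  # "convolutional" part (retained)
W_fc = rng.standard_normal((16, 5))     # fully connected head (discarded after training)

def extract_features(x):
    """The retained feature extractor; the FC classification head is not used."""
    return np.tanh(x @ W_conv)          # M = 16-dimensional feature vector

x = rng.standard_normal(64)             # a stand-in image
f = extract_features(x)
print(f.shape)                          # (16,)
```

After this step, every support image and every query image is represented by an M-dimensional feature vector, which the later steps score and reweight.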



Abstract

The invention relates to a few-shot classification method based on an attention model, comprising the following steps: training a convolutional neural network image classification model, then removing its fully connected layer and retaining its convolutional part; performing basic classification of the image to be detected against the feature vectors of all sample images to obtain n * m scalars, where n is the number of classes and m is the number of sample images per class; concatenating the m sample images of each class into an m * M-dimensional vector and feeding it into a nonlinear mapping function to obtain n * m relative weights; multiplying the n * m scalars by the corresponding n * m relative weights to obtain new n * m scalars, and summing the m new scalars within each class to obtain the final classification result. By mining the importance scores of intra-class images, the method lets the model pay more attention to important samples and balances the intra-class images.
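The pipeline described in the abstract can be sketched end to end as below. This is a hedged illustration under assumptions the abstract does not fix: cosine similarity is an illustrative choice for the basic n * m scores, the nonlinear mapping is a hypothetical one-layer tanh-plus-softmax map, and the sizes (n = 3, m = 5, M = 16) are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, M = 3, 5, 16  # n classes, m support images per class, M-d features (assumed)

# Assumed inputs: feature vectors from the truncated CNN described above.
support = rng.standard_normal((n, m, M))  # m sample-image features per class
query = rng.standard_normal(M)            # feature of the image to be classified

# 1) Basic classification: n * m scalar scores between the query and every
#    sample image (cosine similarity is an illustrative scoring choice).
def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

scores = np.array([[cosine(support[i, j], query) for j in range(m)]
                   for i in range(n)])    # shape (n, m)

# 2) Per class, concatenate the m sample features into one (m*M)-d vector and
#    map it through a nonlinear function to m relative weights (hypothetical map).
W_att = rng.standard_normal((m * M, m))

def relative_weights(class_feats):
    v = class_feats.reshape(-1)           # concatenate into an (m*M)-d vector
    z = np.tanh(v @ W_att)
    e = np.exp(z - z.max())
    return e / e.sum()                    # m weights summing to 1

weights = np.stack([relative_weights(support[i]) for i in range(n)])  # (n, m)

# 3) Multiply the n * m scores by the n * m relative weights, then sum the m
#    reweighted scalars within each class; the largest class score wins.
class_scores = (scores * weights).sum(axis=1)  # shape (n,)
predicted_class = int(np.argmax(class_scores))
print(predicted_class)
```

The design intent matches the abstract: informative sample images receive larger relative weights, so they dominate the per-class sum instead of every support image contributing equally.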

Description

Technical field

[0001] The invention relates to a sample classification method, and in particular to an attention-model-based few-shot classification method for computer vision.

Background technique

[0002] Recognition of image objects is one of the most fundamental and valuable directions in computer vision, and it is the basis of many other image processing methods. At present, most image recognition is based on deep learning networks. Although very high classification accuracy can be achieved in many scenarios, these methods require datasets of millions of images to support network training, and the cost of acquiring samples is extremely high. At the same time, training and using such networks demand a very long time and large amounts of computing resources: training typically takes several days on current advanced GPU hardware, and even after training, recognizing a single image may take several seconds, which is unacceptable, especially on embedded devices. Although i...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045; G06F18/214; G06F18/24
Inventor: 冀中, 柴星亮
Owner: TIANJIN UNIV