
Small sample image classification method based on memory mechanism and graph neural network

A neural network and classification technology, applied in the field of small-sample image classification, which solves the problem that the model cannot use learned concepts for reasoning and prediction on new tasks, and achieves the effects of a simple method and strong practicability.

Active Publication Date: 2021-11-23
EAST CHINA NORMAL UNIV

AI Technical Summary

Problems solved by technology

Although meta-learning and episode-training strategies have achieved remarkable results in small-sample learning, most of them ignore a key issue: when each scenario (episode) arrives for training, how should the knowledge learned in the past be applied to the new task?
[0004] In the existing technology, when facing unknown tasks, the model cannot use the learned concepts for reasoning and prediction.


Examples


Embodiment 1

[0062] Referring to figure 1, the present invention uses a graph neural network and a memory mechanism to help the small-sample model perform reasoning and prediction with the help of learned conceptual knowledge. The specific operation steps are as follows:

[0063] S300: Obtain the trained encoder and classifier from the pre-training stage, and use the encoder and classifier as the initialization of the feature extractor and memory bank in the meta-training stage, respectively.
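A minimal sketch of S300, assuming PyTorch: per the abstract, the pre-trained encoder initializes the meta-training feature extractor, and the pre-trained classifier's per-class weight vectors initialize the memory bank. The function name and argument types below are illustrative placeholders, not the patent's implementation.

```python
import copy
import torch
import torch.nn as nn

def init_meta_training(pretrained_encoder: nn.Module, pretrained_classifier: nn.Linear):
    # Carry the pre-trained encoder weights over as the feature extractor.
    feature_extractor = copy.deepcopy(pretrained_encoder)
    # One memory slot per base class, taken from the classifier's weight matrix
    # (an assumption based on the abstract's "initialization weights of the memory bank").
    memory_bank = pretrained_classifier.weight.detach().clone()  # [num_base_classes, D]
    return feature_extractor, memory_bank
```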

[0064] S310: Using the episode sampling strategy, sample N classes from the training set, with K samples per class forming the support set; then, from the same N classes, sample 1 sample per class to form the query set.
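An illustrative sketch of the episode sampling in S310 (N-way, K-shot support, one query per class). It assumes `dataset` maps each class label to a list of image tensors; `sample_episode` and its arguments are hypothetical names, not from the patent.

```python
import random

def sample_episode(dataset, n_way, k_shot, n_query=1):
    # Pick N classes for this episode.
    classes = random.sample(sorted(dataset.keys()), n_way)
    support, query = [], []
    for episode_label, cls in enumerate(classes):
        images = random.sample(dataset[cls], k_shot + n_query)
        support += [(img, episode_label) for img in images[:k_shot]]  # K support samples per class
        query   += [(img, episode_label) for img in images[k_shot:]]  # 1 query sample per class
    return support, query
```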

[0065] S320: Input the support set and query set sampled in step S310 into the feature extractor to obtain the feature representation of each sample.

[0066] S330: Take the intra-class mean of the support-set sample features to obtain the class center points of the N classes, a...
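A hedged sketch of S320-S330 in PyTorch: encode the support samples and take the intra-class mean of their features to obtain the N class center points. Tensor shapes and the function name are assumptions for illustration.

```python
import torch

def class_centers(feature_extractor, support_images, support_labels, n_way):
    feats = feature_extractor(support_images)      # [N*K, D] feature representations (S320)
    centers = torch.stack([
        feats[support_labels == c].mean(dim=0)     # intra-class mean per class (S330)
        for c in range(n_way)
    ])                                             # [N, D] class center points
    return centers
```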



Abstract

The invention discloses a small-sample image classification method based on a memory mechanism and a graph neural network, in which the small-sample model is helped to perform reasoning and prediction by means of learned conceptual knowledge. The method comprises three stages: pre-training, meta-training, and meta-testing. In pre-training, the trained feature extractor and classifier serve as the initialization weights of the encoder and the memory bank. In meta-training, the features of the support-set and query-set samples are extracted by the encoder, related information of each class is mined from the memory bank to serve as meta-knowledge, and the similarity between task-related nodes and the meta-knowledge is propagated through the graph neural network. In meta-testing, the classification result is obtained from the task-related nodes and the meta-knowledge nodes. Compared with the prior art, the method draws on the human recognition process and uses a memory graph augmentation network based on the information bottleneck, so that well-learned conceptual knowledge helps the model perform reasoning and prediction; the method is simple and convenient, has strong practicability, and has certain application and popularization prospects.
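To make the meta-training and meta-testing flow described above more concrete, the following is a minimal, hedged sketch: the class centers and a query feature form the task-related nodes, the memory-bank rows for the episode classes serve as meta-knowledge nodes, a similarity graph joins them, and one round of message passing refines the nodes before the query is scored against the class nodes. The single propagation step, the cosine-similarity adjacency, and all names are assumptions for illustration, not the patent's exact memory graph augmentation network.

```python
import torch
import torch.nn.functional as F

def propagate_and_classify(centers, query_feat, memory_bank, class_ids):
    # centers: [N, D] class center points; query_feat: [D]; memory_bank: [C, D]
    meta_knowledge = memory_bank[class_ids]        # [N, D] meta-knowledge for the N episode classes
    nodes = torch.cat([centers, query_feat.unsqueeze(0), meta_knowledge], dim=0)
    # Similarity between task-related nodes and meta-knowledge nodes.
    sim = F.cosine_similarity(nodes.unsqueeze(1), nodes.unsqueeze(0), dim=-1)
    adj = F.softmax(sim, dim=-1)                   # normalized similarity graph
    nodes = adj @ nodes                            # one step of graph message passing
    refined_centers = nodes[:centers.size(0)]
    refined_query = nodes[centers.size(0)]
    logits = F.cosine_similarity(refined_query.unsqueeze(0), refined_centers, dim=-1)
    return logits                                  # highest score gives the predicted class
```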

Description

Technical field
[0001] The invention relates to the technical field of small-sample image classification, in particular to a small-sample image classification method based on a memory mechanism and a graph neural network.
Background technique
[0002] The success of deep learning stems from a large amount of labeled data, whereas humans need only a small number of samples to generalize well. This gap has led people to study small-sample learning. Different from traditional deep learning scenarios, the purpose of small-sample learning is not to classify unknown samples, but to quickly adapt to new tasks using very limited labeled data and past knowledge.
[0003] Recently, the idea of combining meta-learning with episode training has achieved significant advantages in solving this problem. Intuitively, using an episode sampling strategy is a promising trend to transfer knowledge from known categories (i.e. known categories with enough tra...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/04, G06N3/084, G06F18/24, G06F18/214
Inventor: 张志忠, 谢源, 刘勋承, 田旭东, 马利庄
Owner: EAST CHINA NORMAL UNIV