
Neural network model training method and device and storage medium

A neural network model training technology, applied to training methods, devices, and storage media, that addresses problems such as the transferred knowledge not including the teacher network's knowledge from its own training process, leaving the knowledge transfer incomplete.

Inactive Publication Date: 2021-05-04
TSINGHUA UNIV
Cites: 0 · Cited by: 2

AI Technical Summary

Problems solved by technology

However, in these methods the student network learns only the knowledge of the fully trained teacher network; it does not capture the knowledge generated during the teacher network's own training process, so the knowledge transfer is incomplete.

Method used



Examples


Example 1

[0087] The following Example 1 illustrates the neural network model training method in the embodiments of the present application:

[0088] Example 1

[0089] Taking image classification as the application scenario: the benchmark dataset is CIFAR-100, split into training and test sets by the standard division (50,000 and 10,000 images respectively); the evaluation criterion is top-1 accuracy; the teacher networks are ResNet18, ResNet50, and DenseNet121; the student networks are MobileNetV2 and ShuffleNetV2; the classifier is a linear classifier; and the sample-augmentation strategy uses random cropping and horizontal flipping.

[0090] The main parameters are set as follows:

[0091] The batch size is 128, the number of iterations is 200, the optimizer is Adam, the framework is PyTorch, and model training uses a Titan Xp graphics card. Process models are saved at integer multiples of 20 epochs.
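The checkpointing schedule described above (200 epochs, saving a teacher "process model" at every multiple of 20 epochs) can be sketched as follows; the function name and constants are illustrative assumptions, not taken from the patent itself:

```python
# Hypothetical sketch of the schedule in paragraph [0091]:
# 200 training epochs, with a teacher "process model" checkpointed
# at every integer multiple of 20 epochs.
EPOCHS = 200
SAVE_EVERY = 20

def snapshot_epochs(epochs=EPOCHS, save_every=SAVE_EVERY):
    """Return the epochs at which a process model would be saved."""
    return [e for e in range(1, epochs + 1) if e % save_every == 0]
```

Under this schedule, ten process models of the teacher network would be saved, at epochs 20, 40, ..., 200.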



Abstract

The invention discloses a neural network model training method, device, and storage medium. The method comprises: during the training of a teacher network, saving the teacher network at pre-selected time nodes as process models; integrating the stored process models to form a new teacher network; and training the student network with the new teacher network.
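One simple way to realize the "integrating the stored process models" step of the abstract is to average the soft outputs (logits) of the saved snapshots to form the new teacher's signal. The helper below is a minimal sketch under that assumption; the function and variable names are illustrative and not drawn from the patent:

```python
import numpy as np

def ensemble_teacher_logits(snapshot_logits):
    """Average the logits of several saved teacher 'process models'
    to form the soft targets of the integrated teacher network.

    snapshot_logits: list of (batch, num_classes) arrays, one array
    per saved time node. (Illustrative helper; averaging is one
    plausible integration scheme, not necessarily the patent's.)
    """
    return np.mean(np.stack(snapshot_logits, axis=0), axis=0)
```

The averaged logits can then be softened with a temperature and used as the distillation target when training the student network.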

Description

technical field

[0001] This application relates to the field of neural network model compression, in particular to a neural network model training method, device, and storage medium.

background technique

[0002] Compared with large-scale deep neural network models, lightweight neural network models generally perform worse and struggle to meet applications with high performance requirements. Model compression is the most common remedy for this problem, and generally includes methods such as model pruning, parameter quantization, and knowledge distillation.

[0003] Knowledge distillation is a concept proposed by Hinton in 2015. It introduces the knowledge of a pre-trained teacher network (generally a large network with superior performance and high complexity) into part of the loss function used to train a student network (a lightweight model to be deployed on the application side, with lower performance and complexity), so as to achi...
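The standard knowledge-distillation loss mentioned in [0003] (Hinton et al., 2015) compares temperature-softened teacher and student distributions. The sketch below shows that soft-target term in numpy; the function names and the temperature value are illustrative assumptions:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_soft_loss(student_logits, teacher_logits, T=4.0):
    """Cross-entropy between the teacher's and student's softened
    distributions, scaled by T^2 as in Hinton et al. (2015).
    In practice this term is mixed with the usual hard-label loss."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    return float(-(p_t * np.log(p_s + 1e-12)).sum(axis=-1).mean() * T * T)
```

In the method of this application, the teacher logits fed to such a loss would come from the integrated ensemble of saved process models rather than from a single trained teacher.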

Claims


Application Information

Patent Timeline
Patent Type & Authority Applications(China)
IPC(8): G06N3/08, G06Q50/20
CPC: G06N3/08, G06Q50/205
Inventors: 黄高, 王朝飞, 宋士吉, 杨琪森
Owner TSINGHUA UNIV