
Neural network compression method and device, computer equipment and storage medium

A neural network compression technology, applied in the field of neural network compression methods, devices, computer equipment, and storage media, which addresses the problem of low accuracy in the student network.

Pending Publication Date: 2020-03-13
TENCENT TECH (SHENZHEN) CO LTD

AI Technical Summary

Problems solved by technology

[0004] However, in the scheme described in the above related art, because the training data is too scarce, the student network is prone to overfitting, which leads to low accuracy in the student network obtained by model compression.

Method used



Examples


[0168] target sparsity r';

[0169] output:

[0170] lightweight student network F_S.

[0171]-[0172] (The algorithm's formulas appear only as images in the source and are not recoverable here.)
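The excerpt above names a target sparsity r' as the input and a lightweight student network F_S as the output, but the procedure itself is shown only as images in the source. As a hedged, generic illustration of how a weight matrix can be brought to a target sparsity (simple magnitude pruning; `prune_to_sparsity` is a hypothetical name, not the patent's algorithm):

```python
import numpy as np

def prune_to_sparsity(W, r):
    # Zero out the k smallest-magnitude weights so that a fraction r
    # of the entries of W become zero (magnitude pruning).
    k = int(round(r * W.size))
    if k == 0:
        return W.copy()
    # Threshold = magnitude of the k-th smallest absolute weight.
    thresh = np.partition(np.abs(W).ravel(), k - 1)[k - 1]
    return np.where(np.abs(W) <= thresh, 0.0, W)

W = np.arange(1.0, 11.0).reshape(2, 5)  # magnitudes 1..10
P = prune_to_sparsity(W, 0.4)
print(float((P == 0).mean()))  # 0.4
```

Ties at the threshold magnitude can zero slightly more than k entries; a production pruner would break ties explicitly.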

[0173] For network compression with very little data, in addition to the schemes provided under steps 404 to 406 above, there is another possible technical alternative: replace the cross-distillation connections between the student network and the teacher network with data augmentation on the feature maps of the hidden layers. This includes adding Gaussian noise to the feature maps, linearly interpolating the feature maps corresponding to different inputs to obtain more intermediate data, and rotating and scaling the feature maps to obtain more generalized intermediate signals, thereby enhancing the generalization ability of the model.
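The augmentations listed in [0173] can be sketched as follows. This is a minimal illustration with hypothetical names (`augment_feature_maps`) and an assumed NCHW feature-map layout; scaling, also mentioned in the text, is omitted for brevity:

```python
import numpy as np

def augment_feature_maps(fmaps, noise_std=0.1, mix_alpha=0.2, seed=0):
    # fmaps: (batch, channels, height, width) hidden-layer feature maps
    rng = np.random.default_rng(seed)
    out = []
    # 1. Add Gaussian noise to each feature map.
    out.append(fmaps + rng.normal(0.0, noise_std, size=fmaps.shape))
    # 2. Linearly interpolate feature maps of different inputs
    #    (mixup-style) to obtain more intermediate data.
    lam = rng.beta(mix_alpha, mix_alpha)
    perm = rng.permutation(fmaps.shape[0])
    out.append(lam * fmaps + (1.0 - lam) * fmaps[perm])
    # 3. Rotate the feature maps by 90 degrees over the spatial axes.
    out.append(np.rot90(fmaps, k=1, axes=(2, 3)))
    return np.concatenate(out, axis=0)

fm = np.ones((4, 8, 6, 6), dtype=np.float32)  # 4 maps, 8 channels, 6x6
aug = augment_feature_maps(fm)
print(aug.shape)  # (12, 8, 6, 6)
```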

[0174] To sum up, in the solution shown in the embodiments of this application, training samples are input into the teacher network and the student network respectively; the first network da...



Abstract

The invention relates to a neural network compression method and device, computer equipment, and a storage medium, in the technical field of neural networks. The method comprises the following steps: inputting training samples into a teacher network and a student network respectively; obtaining first network data comprising a first model parameter and a first feature map of the i-th layer in the teacher network; performing cross calculation on the first network data and second network data to obtain a loss function value, the second network data comprising a second model parameter and a second feature map of the i-th layer; and updating the second model parameter of the i-th layer in the student network according to the loss function value. The method can improve the accuracy of the compressed neural network in scenarios where a trained neural network is compressed with only a small amount of training data.
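The abstract does not spell out the form of the "cross calculation" between teacher and student layer data. As a hedged illustration only, assuming fully connected layers and hypothetical names (`cross_distill_loss`, `prev_t`, `prev_s`, `W_t`, `W_s`), one plausible per-layer loss matches the student layer's output against the teacher's i-th feature map both on the student's own input and on the teacher's:

```python
import numpy as np

def cross_distill_loss(prev_t, prev_s, W_t, W_s):
    # prev_t / prev_s: teacher / student feature maps from layer i-1
    # W_t / W_s: teacher / student weights of layer i
    t_out = prev_t @ W_t  # teacher's i-th feature map (target)
    s_out = prev_s @ W_s  # student's ordinary forward pass
    cross = prev_t @ W_s  # cross term: teacher features through student layer
    return np.mean((s_out - t_out) ** 2) + np.mean((cross - t_out) ** 2)

# When student and teacher agree exactly, the loss is zero.
prev = np.ones((2, 3))
print(cross_distill_loss(prev, prev, np.eye(3), np.eye(3)))  # 0.0
```

In practice the gradient of this value with respect to W_s would drive the update of the student's i-th layer parameters.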

Description

Technical Field

[0001] The embodiments of the present application relate to the technical field of neural networks, and in particular to a neural network compression method, device, computer equipment, and storage medium.

Background Technique

[0002] In recent years, as data-privacy issues in neural network compression have become increasingly prominent, algorithms for compressing an original model with no training data or only a small amount of training data have attracted more and more attention from the industry.

[0003] In the related art, compression algorithms for neural network models with no or little training data are mainly realized through the framework of knowledge distillation. For example, the trained neural network model to be compressed is used as the teacher network, and another neural network with the same structure but a smaller model size than the teacher network is set as the student network, an...
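The knowledge-distillation framework mentioned in [0003] is commonly realized with a temperature-softened soft-label objective (Hinton et al., 2015). The sketch below shows that standard loss as general background, not the patent's exact formulation:

```python
import numpy as np

def softmax(z, T=1.0):
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, T=4.0):
    # KL divergence between temperature-softened teacher and student
    # output distributions, scaled by T^2 as in Hinton et al. (2015).
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T ** 2) * np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean()

logits = np.array([[1.0, 2.0, 3.0]])
print(distillation_loss(logits, logits))  # 0.0 when the networks agree
```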


Application Information

IPC(8): G06N3/04; G06N3/08
CPC: G06N3/082; G06N3/045
Inventors: 柏昊立, 吴家祥, 侯金龙
Owner TENCENT TECH (SHENZHEN) CO LTD