
Principal component analysis (PCA)-based neural network pruning method

A neural network pruning technology, applied in the field of PCA-based neural network pruning, which addresses the problems of large model parameter counts and computation, decreased accuracy, and neglected information in discarded kernels, achieving good performance, reduced cost, and avoidance of over-pruning or under-pruning.

Pending Publication Date: 2021-11-12
SHANGHAI PANCHIP MICROELECTRONICS CO LTD
0 Cites | 1 Cited by

AI Technical Summary

Problems solved by technology

[0002] Neural networks are increasingly widely used in image recognition, speech recognition, and machine translation. However, high-accuracy models often carry large parameter counts and heavy computation, which greatly limits the application of neural networks, so model compression has come into being.
Early model compression algorithms retained the important convolution kernels of each network layer based on the absolute values of the parameters or on statistics of the feature maps, but ignored the information of the discarded portion of each layer, resulting in a drop in accuracy.
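
For context, the magnitude-based baseline criticized above can be sketched in a few lines. This is a generic illustration, not the patent's method; the function name, the `keep_ratio` parameter, and the toy layer shape are assumptions made for the example.

```python
import numpy as np

def rank_filters_by_l1(conv_weights, keep_ratio=0.5):
    """Rank convolution filters by L1 norm and keep the top fraction.

    conv_weights: array of shape (out_channels, in_channels, kH, kW).
    Returns the indices of the filters to retain, in their original order.
    """
    # L1 norm of each output filter, a common importance proxy
    scores = np.abs(conv_weights).sum(axis=(1, 2, 3))
    n_keep = max(1, int(round(keep_ratio * len(scores))))
    keep = np.sort(np.argsort(scores)[::-1][:n_keep])
    return keep

# Toy usage: an 8-filter 3x3 convolution layer
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 3, 3, 3))
print(rank_filters_by_l1(w, keep_ratio=0.5))
```

Because the filters that fall below the threshold are simply dropped, whatever information they carried is lost, which is the accuracy problem the present method aims to avoid.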

Method used



Examples


Embodiment Construction

[0033] Several preferred embodiments of the present invention are described below with reference to the accompanying drawings, so as to make the technical content clearer and easier to understand. The present invention can be embodied in many different forms, and its protection scope is not limited to the embodiments mentioned herein.

[0034] In the drawings, components with the same structure are denoted by the same reference numerals, and components with similar structures or functions are denoted by similar numerals. The size and thickness of each component in the drawings are drawn arbitrarily; the present invention does not limit the size or thickness of any component. For clarity of illustration, the thickness of some parts is exaggerated in places.

[0035] Figure 1 is a flow chart of the method according to a preferred embodiment of the present invention. First, a pe...



Abstract

The invention discloses a PCA-based neural network pruning method, relating to the technical field of pruning. The method comprises the steps of: setting a percentage parameter of the information to be retained; extracting the convolution kernel parameters of each layer from top to bottom; analyzing and transforming the convolution kernel parameters with PCA, and calculating the number and the parameters of the convolution kernels to be retained according to the percentage parameter; reconstructing the neural network model from the retained parameters of each layer; and fine-tuning the model with the training data to obtain the pruned model. The proposed pruning method retains the principal components of the model and removes redundant information from the convolutional layers, reducing the parameter count and the amount of computation, so that the model can be deployed on low-power platforms and cost overhead is reduced.
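
The abstract names the steps but not their implementation. Below is a minimal sketch of one plausible reading of the per-layer PCA step, assuming each convolution kernel is flattened into a row of a matrix, PCA is performed on that matrix, and the retained kernels are derived from the leading principal components needed to reach the retained-information percentage. The function name, the NumPy formulation, and the toy shapes are illustrative assumptions, not the patented procedure.

```python
import numpy as np

def pca_prune_layer(weights, retain=0.95):
    """Sketch of a PCA-based pruning step for one convolution layer.

    weights: kernel tensor of shape (out_channels, in_channels, kH, kW).
    Returns a smaller kernel tensor with k filters, where k is the number
    of principal components needed to cover `retain` of the variance.
    """
    out_c, in_c, kh, kw = weights.shape
    flat = weights.reshape(out_c, -1)              # one row per filter
    centered = flat - flat.mean(axis=0)
    # PCA via SVD of the centered filter matrix
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    ratio = np.cumsum(s ** 2) / np.sum(s ** 2)     # cumulative explained variance
    k = min(len(s), int(np.searchsorted(ratio, retain)) + 1)
    # Use the leading principal directions, reshaped, as the retained kernels
    new_kernels = vt[:k].reshape(k, in_c, kh, kw)
    return new_kernels, k

# Toy usage: a 16-filter 3x3 layer, keeping 95% of the information
rng = np.random.default_rng(0)
w = rng.normal(size=(16, 8, 3, 3))
pruned_w, k = pca_prune_layer(w, retain=0.95)
print(pruned_w.shape, k)
```

After each layer is handled this way, the rebuilt model's following layer would need its input channel count adjusted to k, and the fine-tuning step mentioned in the abstract would then recover accuracy; those parts are omitted from the sketch.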

Description

technical field [0001] The invention relates to the technical field of pruning, and in particular to a PCA-based neural network pruning method. Background technique [0002] Neural networks are increasingly widely used in image recognition, speech recognition, and machine translation. However, high-accuracy models often carry large parameter counts and heavy computation, which greatly limits the application of neural networks, so model compression has come into being. Early model compression algorithms retained the important convolution kernels of each network layer based on the absolute values of the parameters or on statistics of the feature maps, but ignored the information of the discarded portion of each layer, resulting in a drop in accuracy. [0003] Therefore, those skilled in the art are committed to developing a PCA-based neural network compression method. By setting the percentage of information retained in each layer, the re...
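
The description emphasizes a per-layer "percentage of information retained". One natural way to turn that percentage into a kernel count is via the cumulative explained variance of the PCA spectrum; the helper below is an illustrative assumption of that mapping, not text from the patent.

```python
import numpy as np

def components_for_retained_info(singular_values, retain=0.9):
    """Smallest number of principal components whose cumulative explained
    variance reaches the `retain` fraction of the total."""
    var = np.asarray(singular_values, dtype=float) ** 2
    ratio = np.cumsum(var) / var.sum()
    return min(len(var), int(np.searchsorted(ratio, retain)) + 1)

# Example: a spectrum that decays quickly needs only 2 of 5 components
print(components_for_retained_info([5.0, 2.0, 1.0, 0.3, 0.1], retain=0.9))  # -> 2
```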

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08; G06N3/04; G06K9/62
CPC: G06N3/082; G06N3/045; G06F18/2135
Inventors: 张晋侨, 赵新
Owner: SHANGHAI PANCHIP MICROELECTRONICS CO LTD