
Neural network pruning method based on combination of sparse learning and genetic algorithm

A technology combining neural networks with a genetic algorithm, applicable to neural learning methods, biological neural network models, genetic rules, etc. It addresses the problems of unordered weight removal, low neural network compression rates, and broken neural network data structures, and achieves the effects of reduced precision loss and an improved compression ratio.

Status: Inactive · Publication Date: 2020-05-05
XIDIAN UNIV
Cites: 0 · Cited by: 6
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] Unstructured pruning compresses a neural network by removing its unimportant individual weights. However, the removed weights sit at arbitrary, scattered positions, which destroys the network's original data structure, so the pruned result gains no actual speedup on current general-purpose hardware.
[0005] Structured pruning compresses a neural network by removing unimportant channels instead. Since removing whole channels does not destroy the network's data structure, this class of methods runs well on existing computing devices. However, structured pruning locates the network's redundant components less precisely than unstructured pruning does, so it achieves a lower compression rate and has a greater impact on the network's performance.
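The trade-off between the two pruning styles can be illustrated with a small NumPy sketch (toy tensors and thresholds chosen for illustration, not the patent's method): unstructured pruning leaves the tensor's dense shape intact and only scatters zeros through it, while structured pruning physically shrinks the tensor so any dense library benefits.

```python
import numpy as np

rng = np.random.default_rng(0)
# A toy conv-layer weight tensor: (out_channels, in_channels, kH, kW)
w = rng.normal(size=(8, 4, 3, 3))

# Unstructured pruning: zero out the smallest-magnitude weights anywhere.
# The array shape is unchanged, so the zeros are scattered and general
# hardware still performs the full dense computation.
threshold = np.quantile(np.abs(w), 0.5)
w_unstructured = np.where(np.abs(w) < threshold, 0.0, w)

# Structured pruning: drop whole output channels with the smallest L1 norm.
# The tensor physically shrinks, so any dense library runs it faster.
channel_importance = np.abs(w).sum(axis=(1, 2, 3))  # one score per channel
keep = np.sort(np.argsort(channel_importance)[4:])  # keep the top 4 channels
w_structured = w[keep]

print(w_unstructured.shape)  # unchanged dense shape, just sparser
print(w_structured.shape)    # a genuinely smaller layer
```

The shapes make the point: the unstructured result is still an 8-channel dense tensor, while the structured result is a real 4-channel layer.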

Method used




Detailed Description of Embodiments

[0031] The embodiments and effects of the present invention will be further described in detail below in conjunction with the accompanying drawings.

[0032] Referring to Figure 1, the implementation steps of this example are as follows:

[0033] Step 1, train the neural network using sparse learning.

[0034] (1.1) Construct a penalty term from the scaling factors of all channels of the neural network, with the following formula:

[0035] R_s(γ) = Σ_{l=1}^{N} Σ_{i=1}^{n_l} |γ_{l,i}| + ε

[0036] where R_s(γ) denotes the penalty term, N denotes the total number of layers of the neural network, n_l denotes the number of channels in the l-th layer, γ_{l,i} denotes the scaling factor of the i-th channel of the l-th layer, |γ_{l,i}| denotes the absolute value of γ_{l,i}, and ε denotes a constant constraint term;

[0037] (1.2) To the neural network's original cross-entropy loss function f_old(x), the penalty term R_s(γ) is added to t...
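Step 1 above (an absolute-value sparsity penalty over channel scaling factors, added to the training loss) can be sketched in a few lines. The helper names, the weighting factor `lam`, and the exact placement of ε are assumptions made for illustration, not the patent's implementation:

```python
import numpy as np

def sparsity_penalty(gammas, eps=1e-8):
    """R_s(gamma): sum of |gamma_{l,i}| over every channel i of every
    layer l, plus a small constant eps (eps's placement is our assumption)."""
    return sum(np.abs(g).sum() for g in gammas) + eps

def sparse_loss(ce_loss, gammas, lam=1e-4):
    # f_new(x) = f_old(x) + lam * R_s(gamma); lam (a hypothetical name)
    # trades classification accuracy against channel sparsity.
    return ce_loss + lam * sparsity_penalty(gammas)

# Toy usage: scaling factors from two layers of a trained network.
gammas = [np.array([0.9, -0.1, 0.0]), np.array([0.5, 0.05])]
print(sparse_loss(ce_loss=1.25, gammas=gammas))
```

Driving many γ values toward zero during training is what later lets whole channels be removed with little accuracy loss.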



Abstract

The invention discloses a neural network pruning method based on the combination of sparse learning and a genetic algorithm, which mainly addresses the large amounts of storage and computing resources consumed by neural networks. The implementation scheme is as follows: train a neural network with sparse learning to obtain a sparse network structure; perform a heuristic search over potential sub-networks of the trained network using a genetic algorithm with a dynamically adjustable evaluation factor, automatically finding an optimal sub-network that meets the requirements under the guidance of a fitness function; and retrain the resulting optimal sub-network to obtain the final pruning result. The method reduces the neural network's consumption of storage and computing resources, preserves the accuracy of the pruned network, supports adjustable pruning, and can be used for neural network compression.
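The genetic-algorithm search over sub-networks described above can be sketched as follows. The binary mask encoding, the toy fitness function, the target ratio, and all hyperparameters are illustrative assumptions; in the patent's scheme the fitness would be driven by each candidate sub-network's actual evaluation rather than by random importance scores:

```python
import numpy as np

rng = np.random.default_rng(0)
N_CHANNELS = 16          # channels in the (toy) trained network
TARGET_RATIO = 0.5       # desired fraction of channels to keep (assumption)

# Hypothetical stand-in for evaluating a sub-network: per-channel
# importance scores play the role of validation accuracy here.
importance = rng.random(N_CHANNELS)

def fitness(mask, alpha=1.0):
    # Reward retained importance; alpha penalizes deviation from the
    # target compression ratio (an adjustable evaluation factor).
    acc_proxy = importance[mask.astype(bool)].sum()
    ratio_gap = abs(mask.mean() - TARGET_RATIO)
    return acc_proxy - alpha * ratio_gap * importance.sum()

def evolve(pop_size=20, generations=30, mut_rate=0.05):
    # Each chromosome is a binary keep/prune mask over channels.
    pop = rng.integers(0, 2, size=(pop_size, N_CHANNELS))
    for _ in range(generations):
        scores = np.array([fitness(m) for m in pop])
        # Selection: keep the better-scoring half as parents.
        parents = pop[np.argsort(scores)[-pop_size // 2:]]
        kids = []
        for _ in range(pop_size - len(parents)):
            # Single-point crossover between two random parents.
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, N_CHANNELS)
            kid = np.concatenate([a[:cut], b[cut:]])
            # Bit-flip mutation.
            flip = rng.random(N_CHANNELS) < mut_rate
            kid[flip] ^= 1
            kids.append(kid)
        pop = np.vstack([parents, kids])
    return pop[np.argmax([fitness(m) for m in pop])]

best_mask = evolve()
print(best_mask.sum(), "of", N_CHANNELS, "channels kept")
```

The surviving mask selects the channels to retain; retraining the masked network would then correspond to the final step of the scheme.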

Description

Technical field

[0001] The invention belongs to the technical field of computers and mainly relates to a neural network pruning method that can be used for neural network compression.

Background technique

[0002] With the development of deep learning, neural networks have achieved good results in scientific research and industrial applications. However, compared with traditional algorithms, running a neural network requires a large amount of storage and computing resources. Deploying neural networks on devices therefore generates high energy consumption, which conflicts with the goal of energy saving and limits their use on power-constrained mobile devices. Neural network pruning, as a method of compressing neural networks, reduces their storage and computing consumption by removing redundant components, so as to achieve the purpose of red...

Claims


Application Information

Patent Timeline
No application data available.
IPC(8): G06N3/08, G06N3/12
CPC: G06N3/082, G06N3/126
Inventors: 李甫, 石光明, 汪振宇, 谢雪梅
Owner: XIDIAN UNIV