
Method and device for pruning neural network model

A neural network model pruning technology, applied in the field of deep learning, which can solve problems such as tight coupling that hinders the promotion of pruning schemes

Pending Publication Date: 2020-11-27
GUANGZHOU BAIGUOYUAN INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

[0009] This application provides a method and device for pruning a neural network model, to solve the problem in existing pruning technology that the high encapsulation of neural network front-end frameworks makes the model training logic very tightly coupled, which is not conducive to the promotion of pruning schemes.


Examples


Embodiment 1

[0031] Figure 1 is a flowchart of an embodiment of a method for pruning a neural network model provided in Embodiment 1 of the present application. Generally, the pruning technique for a neural network can include the following four stages: 1. Find channels that can be pruned. 2. Calculate the pruning result according to the upper-layer pruning algorithm. 3. Obtain the pruned model. 4. Fine-tune or retrain the pruned model. In this embodiment, stage 1 ("find channels that can be pruned") and stage 3 ("obtain the pruned model") are optimized to eliminate the gap between algorithms and engineering and to realize fully automatic pruning.
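The four stages above can be sketched as a minimal, self-contained pipeline. The magnitude-based criterion and the toy "model" (a dict of per-layer channel weights) are illustrative assumptions standing in for the upper-layer pruning algorithm, not the patent's actual method.

```python
def find_prunable_channels(model):
    # Stage 1: every channel of every convolutional layer is a candidate.
    return {layer: list(range(len(weights))) for layer, weights in model.items()}

def compute_pruning_result(model, candidates, keep_ratio=0.5):
    # Stage 2: stand-in for the upper-layer pruning algorithm -- keep the
    # channels with the largest absolute weight.
    result = {}
    for layer, channels in candidates.items():
        ranked = sorted(channels, key=lambda c: abs(model[layer][c]), reverse=True)
        result[layer] = sorted(ranked[: max(1, int(len(ranked) * keep_ratio))])
    return result

def apply_pruning(model, result):
    # Stage 3: build the pruned model by keeping only the selected channels.
    return {layer: [model[layer][c] for c in kept] for layer, kept in result.items()}

def fine_tune(model):
    # Stage 4: fine-tuning/retraining is a no-op in this sketch.
    return model

model = {"conv0": [0.9, 0.1, -0.7, 0.05], "conv1": [0.3, -0.8]}
pruned = fine_tune(apply_pruning(model, compute_pruning_result(model, find_prunable_channels(model))))
print(pruned)  # {'conv0': [0.9, -0.7], 'conv1': [-0.8]}
```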

[0032] The method for pruning a neural network model in the embodiment of the present application may be performed by a pruning tool or a pruning device, and may specifically include the following steps:

[0033] Step 110, acquire model structure data of the target neural network model, where the model structure dat...

Embodiment 2

[0059] Figure 3 is a flowchart of another embodiment of a method for pruning a neural network model provided in Embodiment 2 of the present application.

[0060] Step 310, acquire model structure data of the target neural network model, where the model structure data includes node information of multiple nodes.

[0061] As an example, the model structure data of the target neural network model may include node information of multiple nodes. The node information may include, but is not limited to: the name of the convolutional layer where the node is located, operation attribute information, parent-node index information, channel data, and the like.
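The per-node fields listed above could be carried in a small container like the following. The field names are hypothetical, chosen only to mirror the list in the paragraph, not the patent's exact schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class NodeInfo:
    name: str                                         # convolutional layer the node belongs to
    op: str                                           # operation attribute information, e.g. "Convolution"
    parents: List[int] = field(default_factory=list)  # parent-node index information
    channels: int = 0                                 # channel data (number of output channels)

node = NodeInfo(name="conv0", op="Convolution", parents=[0], channels=64)
print(node.op, node.channels)  # Convolution 64
```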

[0062] In one embodiment, the front-end framework of the target neural network model can be the MXNet framework, the model structure data can be the json data of the model structure, and the model structure function of the MXNet framework can be called to obtain the json of the model structure of the target neural network...
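As a sketch of this embodiment: MXNet's `Symbol.tojson()` serializes the graph as json with a `"nodes"` list, where each node carries `"op"`, `"name"`, `"attrs"`, and `"inputs"`. Below, a hand-written stand-in for that json is parsed to collect the node information; the `extract_node_info` helper and its output field names are illustrative assumptions.

```python
import json

# Hand-written stand-in for the json returned by MXNet's Symbol.tojson().
model_json = json.dumps({
    "nodes": [
        {"op": "null", "name": "data", "inputs": []},
        {"op": "null", "name": "conv0_weight", "inputs": []},
        {"op": "Convolution", "name": "conv0",
         "attrs": {"num_filter": "8", "kernel": "(3, 3)"},
         "inputs": [[0, 0, 0], [1, 0, 0]]},
        {"op": "Activation", "name": "relu0",
         "attrs": {"act_type": "relu"}, "inputs": [[2, 0, 0]]},
    ],
})

def extract_node_info(model_json):
    """Collect name, op, parent indices, and channel count for each node."""
    nodes = json.loads(model_json)["nodes"]
    info = []
    for node in nodes:
        attrs = node.get("attrs", {})
        info.append({
            "name": node["name"],
            "op": node["op"],
            "parents": [inp[0] for inp in node["inputs"]],
            "channels": int(attrs.get("num_filter", 0)),
        })
    return info

convs = [n for n in extract_node_info(model_json) if n["op"] == "Convolution"]
print(convs)  # [{'name': 'conv0', 'op': 'Convolution', 'parents': [0, 1], 'channels': 8}]
```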

Embodiment 3

[0092] Figure 4 is a structural block diagram of an embodiment of a device for pruning a neural network model provided in Embodiment 3 of the present application. The device for pruning a neural network model may also be called a pruning device, and the device may include:

[0093] A model structure data acquisition module 410, configured to acquire model structure data of the target neural network model, the model structure data including node information of multiple nodes;

[0094] A prunable convolutional layer information determination module 420, configured to determine the prunable convolutional layer information in the target neural network model according to the node information of the plurality of nodes;

[0095] A prunable convolutional layer information sending module 430, configured to provide the prunable convolutional layer information to the upper-layer pruning algorithm, so that the pruning algorithm determines the pruning result of the corresponding convolutional layer according to that information...
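The modules above could be wired together as plain callables, roughly as follows. This is an illustrative sketch only: the class, method names, and the trivial stand-in for the upper-layer pruning algorithm (halving each layer's channels) are assumptions, not the patent's implementation.

```python
class PruningDevice:
    def __init__(self, pruning_algorithm):
        # The upper-layer pruning algorithm is supplied by the caller.
        self.pruning_algorithm = pruning_algorithm

    def acquire_model_structure_data(self, model):
        # Module 410: obtain model structure data (node information).
        return model["nodes"]

    def determine_prunable_layers(self, nodes):
        # Module 420: keep only convolutional layers as pruning candidates.
        return [n for n in nodes if n["op"] == "Convolution"]

    def send_to_algorithm(self, prunable):
        # Module 430: hand the prunable-layer information to the upper-layer
        # algorithm, which returns the pruning result.
        return self.pruning_algorithm(prunable)

# Usage with a trivial stand-in algorithm that halves each layer's channels.
device = PruningDevice(lambda layers: {l["name"]: l["channels"] // 2 for l in layers})
model = {"nodes": [{"op": "null", "name": "data", "channels": 0},
                   {"op": "Convolution", "name": "conv0", "channels": 8}]}
nodes = device.acquire_model_structure_data(model)
result = device.send_to_algorithm(device.determine_prunable_layers(nodes))
print(result)  # {'conv0': 4}
```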



Abstract

The invention discloses a method and device for pruning a neural network model. The method comprises: obtaining model structure data of a target neural network model, wherein the model structure data comprises node information of a plurality of nodes; determining information of the convolutional layers that can be pruned in the target neural network model according to the node information of the plurality of nodes; providing the prunable convolutional layer information to an upper-layer pruning algorithm, so that the pruning algorithm determines a pruning result for the corresponding convolutional layers in the target neural network model according to that information; and obtaining the pruning result determined by the pruning algorithm and modifying the model structure data according to it, so as to construct the pruned target neural network model. In this way, full automation of the pruning process can be realized, the method can adapt to different network structures and different upper-layer pruning algorithms, barriers between algorithms and engineering can be eliminated, and the requirements of engineering universality, simplicity, and convenience can be met.

Description

Technical Field

[0001] The embodiments of the present application relate to deep learning technology, and in particular to a method and device for pruning a neural network model.

Background Technique

[0002] In the field of computer vision, with the continuous development of deep learning technology, the expressive ability of networks has gradually been enhanced, but the required resource consumption has also been increasing. Such excessive resource consumption cannot be met in resource-constrained environments such as mobile edge devices, and model size and inference speed have become important factors affecting the deployment of deep learning. Therefore, network pruning technology, which can speed up network inference and reduce model size, has become a research hotspot.

[0003] Generally, the pruning technology of a neural network can include the following four stages:

[0004] 1. Find channels that can be pruned.

[0005] 2. Calculate the pruning result according to the ...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC(8): G06N3/04; G06N3/08
CPC: G06N3/082; G06N3/045
Inventor 项阳
Owner GUANGZHOU BAIGUOYUAN INFORMATION TECH CO LTD