
Deep neural network compression method of considering load balance

A neural-network compression technology, applied in the field of deep neural network compression that considers load balancing, which addresses problems such as limited acceleration and the inability of CPUs and GPUs to fully exploit sparse network models.

Active Publication Date: 2017-10-10
XILINX TECH BEIJING LTD

AI Technical Summary

Problems solved by technology

[0051] However, CPUs and GPUs cannot fully exploit the benefits of a sparse network model, so the acceleration achieved is limited.

Method used




Embodiment Construction

[0078] Inventors' past research results

[0079] As in the inventors' previous article "Learning both weights and connections for efficient neural networks", a method for compressing neural networks (e.g., CNNs) by pruning has been proposed. The method includes the following steps.

[0080] In the initialization step, the weights of the convolutional layers and the FC layers are initialized to random values, generating a fully connected ANN in which each connection has a weight parameter.

[0081] In the training step, the ANN is trained and its weights are adjusted according to the network's accuracy until the accuracy reaches a predetermined standard. The training step adjusts the weights based on the stochastic gradient descent (SGD) algorithm; that is, weight values are adjusted stochastically and updates are retained based on the resulting change in the ANN's accuracy. For an introduction to stochastic gradient descent, see "Learning both weights and connections for efficient neural networks".
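The initialization, training, and pruning steps above can be sketched as follows. This is a minimal illustration only: the toy one-layer linear model, synthetic data, learning rate, stopping threshold, and 50% pruning ratio are all illustrative assumptions, not values from the patent, and mean-squared error stands in for the "accuracy" criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

# [0080] Initialization: random weights for a tiny one-layer "FC" model
# (4 inputs -> 3 outputs); every connection has a weight parameter.
W = rng.normal(scale=0.1, size=(4, 3))

# Toy regression data standing in for a real training set (assumption).
X = rng.normal(size=(64, 4))
W_true = rng.normal(size=(4, 3))
Y = X @ W_true

def loss(W):
    """MSE, used here as the inverse of 'accuracy'."""
    return float(np.mean((X @ W - Y) ** 2))

# [0081] Training: mini-batch stochastic gradient descent until the
# accuracy criterion (here: MSE below a threshold) is met.
lr, target = 0.05, 1e-3
while loss(W) > target:
    idx = rng.choice(len(X), size=16, replace=False)  # random mini-batch
    Xb, Yb = X[idx], Y[idx]
    grad = 2.0 / len(Xb) * Xb.T @ (Xb @ W - Yb)
    W -= lr * grad

# Pruning (the compression idea of the cited paper): zero out the
# smallest-magnitude half of the weights to obtain a sparse model.
mask = np.abs(W) >= np.quantile(np.abs(W), 0.5)
W_pruned = W * mask
```

In the cited paper the pruned network is then retrained with the zeroed connections held at zero; that retraining loop is omitted here for brevity.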



Abstract

The invention discloses a method of compressing a neural network in which the connection relations of the neurons are expressed by a plurality of matrices. The compression method comprises: a dividing step, in which each of the plurality of matrices is divided into a plurality of sub-matrices; a compressing step, in which each sub-matrix is compressed into a sparse matrix; and an encoding step, in which each compressed sparse sub-matrix is encoded. The invention further provides a device for compressing the neural network.
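A minimal sketch of the abstract's three steps (divide, compress, encode), assuming a row-wise partition and a COO-style encoding. Keeping the same number of non-zeros in every sub-matrix is one way such a scheme can balance the load across parallel processing elements; the function name, partition scheme, and keep ratio below are illustrative assumptions, not the patent's specification.

```python
import numpy as np

def compress_matrix(W, n_sub=4, keep_ratio=0.25):
    """Divide W into n_sub row blocks, prune each block to the same
    number of non-zeros (balanced load), and encode each block in a
    COO-style sparse format: (shape, row_idx, col_idx, values)."""
    sub_matrices = np.array_split(W, n_sub, axis=0)        # dividing step
    encoded = []
    for sub in sub_matrices:
        k = max(1, int(keep_ratio * sub.size))             # non-zeros to keep
        thresh = np.sort(np.abs(sub), axis=None)[-k]       # k-th largest magnitude
        sparse = np.where(np.abs(sub) >= thresh, sub, 0.0) # compressing step
        rows, cols = np.nonzero(sparse)                    # encoding step
        encoded.append((sparse.shape, rows, cols, sparse[rows, cols]))
    return encoded

# Each sub-matrix ends up with the same non-zero count, so parallel
# processing elements receive equal work.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))
enc = compress_matrix(W, n_sub=4, keep_ratio=0.25)
counts = [len(vals) for (_, _, _, vals) in enc]
```

Because each block is pruned against its own threshold (rather than one global threshold), no processing element is starved of non-zeros, which is the load-balancing idea the title refers to.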

Description

[0001] This application claims priority to US Patent Application No. 15/242,622, filed August 22, 2016, and US Patent Application No. 15/242,624, filed August 22, 2016.

Field of invention

[0002] The invention relates to a deep neural network compression method and device that consider load balancing.

Background technique

[0003] Compression of Artificial Neural Networks

[0004] Artificial Neural Networks (ANNs), also referred to as neural networks (NNs), are mathematical computing models that imitate the behavioral characteristics of animal neural networks and perform distributed parallel information processing. In recent years, neural networks have developed rapidly and are widely used in many fields, such as image recognition, speech recognition, natural language processing, weather forecasting, gene expression, and content recommendation.

[0005] In a neural network, a large number of nodes (also called "neurons") are connected to each other. The neural network has ...


Application Information

IPC(8): G06N3/04, G06N3/08
CPC: H04L63/10, G06N3/04, G06N3/063, G06N3/08, G10L15/063, G10L15/16, G06N3/045
Inventors: 李鑫 (Li Xin), 陆智麟 (Lu Zhilin), 单羿 (Shan Yi)
Owner XILINX TECH BEIJING LTD