
Device and method for executing batch normalization operation

A technology of computing modules and computing units, applied in the field of artificial neural networks. It addresses problems such as off-chip bandwidth becoming a performance bottleneck, and achieves the effects of avoiding performance bottlenecks, reducing memory-access bandwidth, and supporting both the serial and parallel operations involved in normalization.

Active Publication Date: 2017-11-10
CAMBRICON TECH CO LTD


Problems solved by technology

Since the GPU is a device designed for graphics, image, and scientific computation, it has no dedicated support for multi-layer artificial neural network batch normalization operations; performing such operations still requires substantial front-end decoding work, which incurs significant overhead. In addition, the GPU has only a small on-chip cache, so the model data for multi-layer artificial neural network batch normalization must be repeatedly transferred from off-chip memory, and off-chip bandwidth becomes the main performance bottleneck. Moreover, the batch normalization operation involves a large number of normalization computations such as summation, for which the GPU's parallel architecture is not well suited.

Method used




Embodiment Construction

[0017] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with specific embodiments and with reference to the accompanying drawings.

[0018] The batch normalization operation includes a forward part and a backward part. Both the forward and backward passes of the batch normalization operation are needed during the training of an artificial neural network, while only the forward pass is performed when the trained network is used. During use of the artificial neural network, the parameters obtained during training are reused: data such as the mean and variance in the batch normalization operation do not need to be recomputed.
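The inference-time forward pass described above reuses stored statistics rather than recomputing them. A minimal NumPy sketch follows; the function name, argument layout, and epsilon value are illustrative assumptions, not the patent's interface:

```python
import numpy as np

def batchnorm_inference(x, mean, var, alpha, beta, eps=1e-5):
    """Forward batch normalization at inference time.

    Uses the per-feature mean and variance recorded during training,
    so no statistics are recomputed; only an elementwise transform:
        y = alpha * (x - mean) / sqrt(var + eps) + beta
    (names are illustrative, not from the patent)
    """
    x_hat = (x - mean) / np.sqrt(var + eps)  # normalize with stored stats
    return alpha * x_hat + beta              # scale and shift
```

Because the statistics are fixed, this pass is a pure elementwise map over the input and needs no reduction across the batch.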

[0019] Figure 1 shows the overall structure diagram of the device for performing the artificial neural network batch normalization operatio...



Abstract

The invention discloses a device for executing the batch normalization operation. The device comprises an instruction storage unit, a controller unit, a data access unit, and an operation unit, and can realize the batch normalization operation in a multi-layer artificial neural network. In the forward pass of the batch normalization operation, the mean is subtracted from the input and the result is divided by the square root of the sum of the variance and a small constant; the result is then multiplied by a learning parameter alpha and added to a learning parameter beta to obtain the output of the layer. In the backward training pass, the mean of the gradient vector is subtracted from the input gradient vector; the mean of the product of the gradient vector and the forward output, multiplied by that output, is also subtracted; and the resulting difference is divided by the square root of the sum of the forward variance and the small constant to obtain the output gradient vector of the layer. Through the device, support for the forward and backward batch normalization operations in artificial neural networks is effectively improved.
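The forward and backward formulas described in the abstract can be sketched in NumPy as follows. This is a sketch under assumptions, not the patent's implementation: function names and the cached values are illustrative, the "forward output" in the backward formula is taken to be the normalized value x_hat, and (following the abstract) the learning parameters are omitted from the gradient:

```python
import numpy as np

def batchnorm_forward(x, alpha, beta, eps=1e-5):
    """Forward pass: normalize each feature over the batch, then scale/shift."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)  # (input - mean) / sqrt(var + eps)
    out = alpha * x_hat + beta               # multiply by alpha, add beta
    return out, (x_hat, var)                 # cache values needed backward

def batchnorm_backward(dout, cache, eps=1e-5):
    """Backward pass following the abstract's formula: subtract the mean of
    the gradient vector, subtract the mean of (gradient * forward output)
    times that output, then divide by sqrt(forward variance + eps).
    Here the forward output is assumed to mean the normalized value x_hat."""
    x_hat, var = cache
    dx = (dout
          - dout.mean(axis=0)
          - x_hat * (dout * x_hat).mean(axis=0)) / np.sqrt(var + eps)
    return dx
```

Note that the forward pass requires reductions (mean, variance) across the batch as well as elementwise work, which is the mix of serial normalization operations and parallel operations the summary refers to.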

Description

technical field [0001] The present invention relates to artificial neural network technology, and in particular to a device and method for performing the forward and backward batch normalization operations in an artificial neural network. Background technique [0002] Multi-layer artificial neural networks are widely used in the fields of pattern recognition, image processing, function approximation, and optimization. In recent years, multi-layer artificial neural networks have received increasing attention from both academia and industry due to their high recognition accuracy and good parallelizability, and the batch normalization operation is used more and more in multi-layer neural networks because it can accelerate training and improve recognition accuracy. [0003] One known method to support batch normalization operations is to use a general-purpose processor. The method ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08, G06N3/06
CPC: G06N3/063, G06N3/084, G06N3/08, G06N3/045
Inventor: 刘少礼, 于涌, 陈云霁, 陈天石
Owner CAMBRICON TECH CO LTD