
Neural network accelerator and data processing method

A neural network accelerator technology, applied in the field of computing, which solves problems such as excessive consumption of computing resources, reduced utilization of hardware resources, and unbalanced resource utilization, achieving the effect of improving hardware resource utilization and processing efficiency.

Active Publication Date: 2018-08-24
INST OF COMPUTING TECH CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0003] The LSTM network is a time-recursive recurrent neural network that can learn long-term dependencies. It is often used for language translation, robot control, image analysis, etc. The main part of its calculation is the multiply-accumulate operation of each gate value vector within the iterative operation of each layer. In the prior art, when performing LSTM network calculations, resource usage is often unbalanced, leaving some computing units in an idle state; and when operating at the scale of the vector dimension, computing resource consumption is excessive. This reduces the utilization of hardware resources.




Embodiment Construction

[0024] To make the purpose, technical solution, and advantages of the present invention clearer, the neural network accelerator and data processing method provided in the embodiments of the present invention are described in further detail below with reference to the accompanying drawings.

[0025] When computing an LSTM network, the calculation mainly concerns the "cell state", which carries information from one unit to the next. The LSTM network uses structures that selectively pass information, called "gates", to control the discarding or adding of information to the cell state, thereby realizing the functions of forgetting and remembering.
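As a sketch of how a gate selectively discards or adds information, the standard LSTM cell-state update can be written in a few lines of NumPy. Note this update formula is standard LSTM background, not quoted from this excerpt, and the function and variable names are illustrative:

```python
import numpy as np

def update_cell_state(c_prev, f_t, i_t, g_t):
    """Standard LSTM cell-state update: the forget gate f_t scales
    (discards) old information in c_prev, while the input gate i_t
    admits (adds) new candidate information g_t."""
    return f_t * c_prev + i_t * g_t

# A forget gate of 0 erases the old state; an input gate of 1 writes g_t.
c_t = update_cell_state(np.array([5.0]),
                        f_t=np.array([0.0]),
                        i_t=np.array([1.0]),
                        g_t=np.array([0.7]))
```

With these inputs the old state 5.0 is fully forgotten and the new candidate 0.7 is fully remembered, so `c_t` equals `[0.7]`.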

[0026] The formulas of the known LSTM model are:

[0027] I_t = δ(W_xi · X_t + W_hi · H_(t-1) + b_it)    (1.1)

[0028] f_t = δ(W_xf · X_t + W_hf · H_(t-1) + b_ft)    (1.2)

[0029] o_t = δ(W_xo · X_t + W_ho · H_(t-1) + b_ot)    (1.3)

[0030] G_t = h(W_xg · X_t + W_hg · H_(t-1) + b_gt)    (1.4)
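Assuming δ denotes the sigmoid function and h denotes tanh (the common LSTM convention, not stated explicitly in this excerpt), the gate equations 1.1–1.4 can be sketched in NumPy. All function and variable names here are illustrative, not taken from the patent:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_gates(x_t, h_prev, W_x, W_h, b):
    """Compute the four LSTM gate vectors of Eqs. 1.1-1.4.
    W_x, W_h, b are dicts keyed by gate name: 'i' (input), 'f' (forget),
    'o' (output), 'g' (candidate cell input)."""
    gates = {}
    for g in ('i', 'f', 'o'):
        # Sigmoid-activated gates (Eqs. 1.1-1.3): each is a
        # multiply-accumulate of the input and the previous hidden
        # state against that gate's weight matrices, plus a bias.
        gates[g] = sigmoid(W_x[g] @ x_t + W_h[g] @ h_prev + b[g])
    # Candidate cell input G_t uses h = tanh (Eq. 1.4).
    gates['g'] = np.tanh(W_x['g'] @ x_t + W_h['g'] @ h_prev + b['g'])
    return gates
```

Each gate is structurally identical, which is why the patent's vector multiply-accumulate hardware can be shared across all four computations.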

...



Abstract

The invention relates to a neural network accelerator. The neural network accelerator comprises: a storage unit for storing and outputting neuron data and weight data of an LSTM network; a vector multiply-accumulate matrix unit for receiving data from the storage unit, performing vector multiply-accumulate operations on the received data, and outputting the operation results; an addition unit for receiving data from the vector multiply-accumulate matrix unit and performing an offset addition operation on the received data; an activation unit for receiving data from the multifunctional operation unit and / or the storage unit, performing an activation operation on the received data, and outputting the activation result; and a vector parallel multiply-accumulate unit for receiving data from the activation unit and / or the storage unit and performing multiply-accumulate operations on the received data. All modules are connected end to end to form a pipeline working mechanism, which processes the data together with the input vectors by taking weight row vectors as the unit of work.
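The pipeline described in the abstract can be modeled functionally as a chain of stages that consumes one weight row vector per step. The stage names below follow the abstract, but the sequential control flow, the choice of sigmoid as the activation, and the elementwise multiply against a prior cell-state vector are illustrative assumptions for this sketch, not the patented hardware design:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def accelerator_pipeline(W, x, bias, c_prev):
    """Functional model of the accelerator's pipeline, processing one
    weight row vector at a time, as the abstract describes."""
    out = np.empty(W.shape[0])
    for i, row in enumerate(W):
        acc = row @ x            # vector multiply-accumulate matrix unit
        acc = acc + bias[i]      # addition (offset) unit
        act = sigmoid(acc)       # activation unit
        out[i] = act * c_prev[i] # vector parallel multiply-accumulate unit
    return out                   # results stream back to the storage unit
```

Because each row's result depends only on that row, the stages can overlap across rows in hardware, which is the source of the pipeline's improved utilization.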

Description

Technical Field

[0001] The invention relates to the computing field, in particular to an LSTM-oriented neural network accelerator and a data processing method.

Background Technique

[0002] The neural network is one of the most highly developed perception models in the field of artificial intelligence. As soon as it appeared, it became a research hotspot in academia and industry. With the deepening of research, different types of neural networks have been proposed one after another, for example, the Long Short-Term Memory network (LSTM).

[0003] The LSTM network is a time-recursive recurrent neural network that can learn long-term dependencies. It is often used for language translation, robot control, image analysis, etc. The main part of its calculation is the multiply-accumulate operation of each gate value vector within the iterative operation of each layer. In the prior art, when performing calculatio...

Claims


Application Information

IPC(8): G06N3/04 G06N3/063 G06N3/08 G06F17/16
CPC: G06F17/16 G06N3/063 G06N3/084 G06N3/048
Inventor: 韩银和, 闵丰, 许浩博, 王颖
Owner INST OF COMPUTING TECH CHINESE ACAD OF SCI