
Operation device and method of accelerating chip which accelerates depth neural network algorithm

A deep neural network acceleration chip technology, applied in the field of computing devices for acceleration chips, addressing problems such as increased chip power consumption, an increased number of intermediate-value reads and writes to main memory, and non-compliance with low-power design goals.

Active Publication Date: 2017-03-22
INST OF COMPUTING TECH CHINESE ACAD OF SCI
Cites: 5 · Cited by: 50

AI Technical Summary

Problems solved by technology

A major problem is that a large number of intermediate values are generated and must be stored, so the required main-memory space increases.
At the same time, this approach increases the number of times intermediate values are written to or read from main memory, raising the chip's power consumption, which conflicts with the low-power accelerator chip design concept described above.
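To make the traffic argument concrete, here is a minimal sketch (not from the patent; the functions and access-counting model are illustrative assumptions) comparing the main-memory accesses of a two-stage vector computation when the intermediate result is spilled to main memory versus kept in an on-chip buffer:

```python
# Hypothetical access-counting model: each element read from or written
# to main memory costs one access; on-chip buffer traffic is free.

def traffic_with_spill(n):
    # Stage 1 reads n inputs and writes an n-element intermediate to
    # main memory; stage 2 reads the intermediate back and writes the
    # n-element result.
    reads = n + n        # inputs + re-read of the intermediate
    writes = n + n       # intermediate + final result
    return reads + writes

def traffic_on_chip(n):
    # The intermediate stays in the module's local storage area, so
    # main memory sees only the inputs and the final result.
    reads = n
    writes = n
    return reads + writes

n = 1024
print(traffic_with_spill(n))  # 4096 accesses
print(traffic_on_chip(n))     # 2048 accesses
```

Under this simple model, keeping the intermediate on-chip halves the main-memory traffic, which is the energy saving the design aims at.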




Embodiment Construction

[0063] To make the object, technical solution, and advantages of the present invention clearer, the computing device and method of the acceleration chip for accelerating deep neural network algorithms are described in further detail below in conjunction with the accompanying drawings.

[0064] The computing device of the accelerator chip for accelerating the deep neural network algorithm is part of a computing system and exchanges data with an external processor over a data bus. Figure 1 shows the relationship between each constituent module of the device and the main memory. The device includes a main memory 5, a vector addition processor 1, a vector function value operator 2, and a vector multiply-adder 3. Among them, the vector addition processor 1, the vector function value operator 2 a...
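The module arrangement above can be sketched in software. The following Python model is a hedged illustration only (class and method names are invented, not the patent's interface): each module keeps its intermediate result in a local storage area and passes it directly to the next module instead of routing it through main memory.

```python
import math

class VectorMultiplyAdder:
    """Models module 3: computes z = W.x + b elementwise."""
    def __init__(self):
        self.buffer = None  # intermediate-value storage area

    def run(self, W, x, b):
        self.buffer = [sum(w * xi for w, xi in zip(row, x)) + bi
                       for row, bi in zip(W, b)]
        return self.buffer

class VectorFunctionValueOperator:
    """Models module 2: applies a nonlinear function (sigmoid here)."""
    def __init__(self):
        self.buffer = None

    def run(self, z):
        self.buffer = [1.0 / (1.0 + math.exp(-v)) for v in z]
        return self.buffer

class VectorAdditionProcessor:
    """Models module 1: vector addition/subtraction, e.g. for
    pooling-layer sums (not exercised in this minimal demo)."""
    def __init__(self):
        self.buffer = None

    def run(self, a, b):
        self.buffer = [u + v for u, v in zip(a, b)]
        return self.buffer

# One forward step of a small layer: z = W.x + b, y = sigmoid(z).
# The intermediate z flows module-to-module, never touching "main memory".
W = [[0.5, -0.2], [0.1, 0.3]]
x = [1.0, 2.0]
b = [0.0, 0.1]
mad = VectorMultiplyAdder()
act = VectorFunctionValueOperator()
y = act.run(mad.run(W, x, b))
print([round(v, 4) for v in y])
```

The key design point mirrored here is that each module owns a `buffer` (the patent's intermediate-value storage area), so chaining modules requires no main-memory round trip.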



Abstract

The invention provides an operation device and method for an acceleration chip that accelerates a deep neural network algorithm. The device comprises a vector addition processor module, which performs vector addition or subtraction and/or the vector operations of the pooling-layer algorithm in the deep neural network algorithm; a vector function value operator module, which performs the vector operations of nonlinear function evaluation in the deep neural network algorithm; and a vector multiply-adder module, which performs vector multiply-add operations. The three modules execute programmable instructions and interact with one another to compute the neural network's output and the synaptic weight changes that represent the strength of neuron action between network layers. Each module is provided with an intermediate-value storage area and performs read and write operations on the main memory. In this way, the number of intermediate-value reads and writes to main memory is reduced, the energy consumption of the accelerator chip is lowered, and data loss and replacement problems during data processing are avoided.
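The abstract assigns the synaptic-weight-change computation to the vector multiply-adder module. As a hedged illustration (the function name and the gradient-style update rule are assumptions, not quoted from the patent), such a change amount can be written as an outer-product multiply-add, delta_w[i][j] = lr * error[j] * activation[i]:

```python
# Hypothetical sketch of a synaptic weight-change computation between
# two layers: each entry is a scalar multiply-add of a learning rate,
# an upstream activation, and a downstream error term -- exactly the
# kind of vector multiply-add work the abstract describes.

def weight_delta(lr, activations, errors):
    """Return delta_w with delta_w[i][j] = lr * errors[j] * activations[i]."""
    return [[lr * e * a for e in errors] for a in activations]

dw = weight_delta(0.1, [1.0, 0.5], [0.2, -0.4])
for row in dw:
    print([round(v, 4) for v in row])
```

Because every row is an independent vector scaling, such an update maps naturally onto a vector multiply-adder operating one row at a time from its local buffer.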

Description

Technical field [0001] The invention belongs to the fields of neural network algorithms and computer hardware. More specifically, the present invention relates to a computing device and method of an acceleration chip for accelerating deep neural network algorithms. Background technique [0002] Artificial neural network algorithms have been a research hotspot in the field of artificial intelligence since the 1980s. They abstract the human brain's neuron network from the perspective of information processing, establish a simple model, and form different networks according to different connection methods. They have a self-learning function, gradually learning to recognize and predict through training; an associative storage function, which gives the algorithms high robustness; and high parallelism, with the ability to search for optimal solutions at high speed in complex big-data problems. They have strong plasticity and can fully approximate any complex ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/06
CPC: G06N3/063, G06N3/084, G06N3/045, G06F17/16
Inventors: 李震, 刘少礼, 张士锦, 罗韬, 钱诚, 陈云霁, 陈天石
Owner: INST OF COMPUTING TECH CHINESE ACAD OF SCI