
Neural network computing device and processor comprising same

A computing device for neural networks, applied in the field of neural network processors. It addresses problems such as large on-chip memory access bandwidth, increased memory access power consumption, and stricter processor bandwidth design requirements, with the effect of improving computing efficiency and reducing bandwidth demands.

Active Publication Date: 2018-01-12
INST OF COMPUTING TECH CHINESE ACAD OF SCI


Problems solved by technology

[0004] However, a neural network processor is both computationally intensive and memory intensive. On the one hand, a neural network model involves a large number of multiply-accumulate operations and other nonlinear operations, so the processor must sustain a high load to meet the model's computing requirements. On the other hand, neural network computation involves a large number of parameter iterations, and the computing units must access memory frequently, which greatly raises the processor's bandwidth design requirements and increases memory access power consumption.
[0005] Therefore, it is necessary to improve existing neural network processors to address the high hardware overhead of computing circuits and the large on-chip memory access bandwidth.



Detailed Description of Embodiments

[0029] In order to make the purpose, technical solution, design method, and advantages of the present invention clearer, the present invention is described in further detail below through specific embodiments with reference to the accompanying drawings. It should be understood that the specific embodiments described here serve only to explain the present invention, not to limit it.

[0030] Figure 1 shows the general topology of a neural network in the prior art. A neural network is a mathematical model built by modeling the structure and behavior of the human brain. It is usually divided into an input layer, one or more hidden layers, and an output layer, each layer consisting of multiple neuron nodes. The output values of the neuron nodes in one layer (referred to herein as neuron data or node values) are passed as inputs to the neuron nodes of the next layer, connecting layer by layer. The neur...
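The layer-by-layer propagation described above can be sketched in a few lines. This is a minimal illustrative model, not code from the patent; the layer sizes, weights, and the choice of tanh as the activation are assumptions for the example.

```python
import math

def forward(layers, x):
    """Propagate input x through fully connected layers.

    `layers` is a list of (weights, biases) pairs; the output values of
    one layer's neurons become the inputs to the next layer's neurons.
    """
    for weights, biases in layers:
        # each neuron: weighted sum of the previous layer's outputs,
        # plus a bias, passed through a nonlinear activation
        x = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(weights, biases)]
    return x

# toy network: 2 inputs -> 2 hidden neurons -> 1 output neuron
net = [
    ([[0.5, -0.2], [0.1, 0.4]], [0.0, 0.1]),  # input -> hidden
    ([[0.3, 0.7]], [0.0]),                    # hidden -> output
]
print(forward(net, [1.0, 2.0]))
```

The multiply-accumulate inside the inner sum is exactly the operation that dominates the workload discussed in paragraph [0004].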



Abstract

The invention provides a neural network computing device and a processor comprising the same. The computing device comprises a systolic array processing unit and a main processor. The main processor controls the loading of computational elements of a neural network into the systolic array processing unit and their transmission within it. The systolic array processing unit is composed of a plurality of processing units; each processing unit performs calculations on received computational elements and/or passes them to the next processing unit, where the computational elements include neuron data and corresponding weight values. The computing device can speed up neural network computation and reduce the bandwidth demand of the computing process.
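The compute-and-pass behaviour the abstract ascribes to each processing unit can be sketched as a chain of processing elements, each holding a weight, multiplying the neuron datum it receives, adding to a running partial sum, and forwarding both to its neighbour. This is a hedged functional sketch of the general systolic idea, not the patent's actual architecture; the names `PE` and `systolic_dot` are illustrative.

```python
class PE:
    """One processing element holding a stationary weight value."""

    def __init__(self, weight):
        self.weight = weight

    def step(self, x, partial):
        # compute on the received neuron datum, then pass it onward
        # together with the updated partial sum
        return x, partial + self.weight * x

def systolic_dot(weights, xs):
    """Dot product via a chain of PEs: data ripple through the array."""
    pes = [PE(w) for w in weights]
    partial = 0.0
    for pe, x in zip(pes, xs):
        x, partial = pe.step(x, partial)
    return partial

print(systolic_dot([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```

Because each element is loaded once and reused as it flows through neighbouring units, the array avoids re-fetching operands from memory at every step, which is the bandwidth reduction the abstract claims.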

Description

Technical field

[0001] The invention relates to the technical field of artificial intelligence, and in particular to a neural network computing device and a neural network processor comprising the computing device.

Background technique

[0002] Deep learning is an important branch of machine learning that has made major breakthroughs in recent years. Neural network models trained with deep learning algorithms have achieved remarkable results in application fields such as image recognition, speech processing, and intelligent robotics.

[0003] A deep neural network simulates the neural connection structure of the human brain by building a model, describing data features hierarchically through multiple transformation stages when processing signals such as images, sounds, and text. As the complexity of neural networks continues to grow, neural network technology faces many problems in practical applications, such as occupying a lot ...

Claims


Application Information

IPC IPC(8): G06N3/06
Inventor 韩银和许浩博王颖
Owner INST OF COMPUTING TECH CHINESE ACAD OF SCI