
Neural network operation system, method, device and storage medium

A neural-network and computing-system technology, applied in biological neural network models, memory systems, computing, etc.; it addresses problems such as reduced processor computing efficiency and wasted resources in data movement.

Active Publication Date: 2021-05-28
GREE ELECTRIC APPLIANCES INC

AI Technical Summary

Problems solved by technology

Because a deep neural network is computed layer by layer and each layer's input and output data are large, considerable memory must be provided to store intermediate data and weight data. As a result, a large share of resources is wasted moving data between cache and memory; this gives rise to the memory-wall problem, which reduces the processor's computing efficiency.




Embodiment Construction

[0052] To make the purpose, technical solutions, and advantages of this application clearer, the technical solutions in its embodiments are described below, completely and clearly, with reference to the accompanying drawings. Clearly, the described embodiments are only some, not all, of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art from these embodiments without creative effort fall within the scope of protection of this application. Where no conflict arises, the embodiments of the present application and the features within them may be combined arbitrarily with one another. Also, although the flowcharts show a logical order, in some cases the steps shown or described may be performed in an order different from that shown or described herein.

[0053] The terms "first" and "second" in the specificatio...



Abstract

The invention provides a neural network operation system, method, device, and storage medium for reducing data movement during neural-network computation and improving the operating efficiency of a neural network processor. The system comprises at least two neural network processing units, a first storage unit, and a second storage unit. The first storage unit stores the network's input and output data and the operation parameters required by each layer of the neural network. The second storage unit provides input and output caches for each of the at least two neural network processing units: each processing unit has two input caches and two output caches, and the two output caches of one of any two adjacent processing units are the two input caches of the other. The at least two neural network processing units are connected in a ring.
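The ring topology described above can be sketched in software. The following Python model is a hypothetical illustration, not the patent's actual implementation: each processing unit (`NPU`) owns two shared buffers with its ring neighbor, so the output buffers of unit i are the input buffers of unit i+1, and ping-pong (double) buffering alternates between the two slots so one can be written while the other is read, avoiding round trips to main memory. The class and function names are assumptions for the sketch.

```python
class NPU:
    """Hypothetical stand-in for one neural network processing unit."""

    def __init__(self, name, scale):
        self.name = name
        self.scale = scale  # stand-in for one layer's learned parameters

    def compute(self, data):
        # Placeholder for a layer's work; real hardware would run conv/matmul.
        return [x * self.scale for x in data]


def run_ring(npus, data, steps):
    """Push `data` around the ring for `steps` hops using shared ping-pong
    buffers: buffers[i] is the pair shared between npus[i-1] (as output
    caches) and npus[i] (as input caches)."""
    n = len(npus)
    buffers = [[None, None] for _ in range(n)]
    buffers[0][0] = data                 # initial input lands in slot 0
    for step in range(steps):
        i = step % n                     # which unit computes this hop
        slot = step % 2                  # ping-pong: alternate buffer slots
        out = npus[i].compute(buffers[i][slot])
        # Write into the input buffer of the next unit in the ring; the
        # other slot remains free for the next hop's transfer.
        buffers[(i + 1) % n][(step + 1) % 2] = out
    return buffers[steps % n][steps % 2]


ring = [NPU("npu0", 2), NPU("npu1", 3), NPU("npu2", 5)]
result = run_ring(ring, [1, 1], steps=3)  # one full trip around the ring
# result == [30, 30]: each element scaled by 2 * 3 * 5
```

Because adjacent units exchange data directly through shared caches rather than through the first (main) storage unit, intermediate layer outputs never leave the second storage unit, which is the data-movement reduction the abstract claims.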

Description

Technical Field

[0001] The present invention relates to processors, and in particular to a neural network computing system, method, device, and storage medium.

Background

[0002] When running a neural network model, a neural network processing unit (NPU) can provide sufficient computing power for the training and inference of deep neural networks. Because a deep neural network is computed layer by layer and each layer's input and output data are large, considerable memory must be provided to store intermediate data and weight data. As a result, a large share of resources is wasted moving data between cache and memory; this gives rise to the memory-wall problem, which reduces the processor's computing efficiency.

Summary of the Invention

[0003] Embodiments of the present application provide a neural network computing system, method, device, and storage medium, which are used to reduce...


Application Information

IPC(8): G06F12/0842; G06F12/0811; G06F12/084; G06N3/063
CPC: G06F12/0842; G06F12/0811; G06F12/084; G06N3/063; G06F2212/1044; G06F2212/1056; Y02D10/00
Inventor 刘文峰
Owner GREE ELECTRIC APPLIANCES INC