
Computing platform implementation method and system for neural network

A computing-platform and neural-network technology, applied in the field of deep learning, that addresses the problem of computation time being short relative to memory-read time (i.e., memory access becoming the bottleneck), and achieves the effects of eliminating that bottleneck, avoiding unnecessary bandwidth occupation, and improving computing efficiency.

Publication Date: 2019-10-11 (status: Inactive)
XILINX TECH BEIJING LTD

AI Technical Summary

Problems solved by technology

[0003] When a highly parallel computing platform such as an FPGA or GPU performs neural network inference, the execution time required for the computation is very short compared with the time cost of reading the parameters required for the operation, so memory reads become the bottleneck that limits processing speed.
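
As a rough illustration of this imbalance, the sketch below compares how long a bandwidth-limited accelerator would spend fetching one layer's weights versus computing with them. All figures (throughput, bandwidth, layer size, 16-bit weights) are assumed example values, not taken from the patent.

```python
# Back-of-the-envelope comparison of compute time vs. parameter-load time.
# Every number here is an assumption chosen only to illustrate the imbalance.

PEAK_OPS_PER_S = 2e12      # assumed accelerator throughput: 2 TOPS
DDR_BYTES_PER_S = 10e9     # assumed external-memory bandwidth: 10 GB/s

# Assumed fully-connected layer: 4096 inputs x 4096 outputs, 16-bit weights
macs = 4096 * 4096                      # multiply-accumulate operations
weight_bytes = 4096 * 4096 * 2          # parameter bytes fetched from DDR

compute_s = 2 * macs / PEAK_OPS_PER_S   # 1 MAC = 2 ops (multiply + add)
load_s = weight_bytes / DDR_BYTES_PER_S

print(f"compute: {compute_s * 1e6:.1f} us, parameter load: {load_s * 1e6:.1f} us")
# Under these assumptions the parameter load takes roughly 200x longer than
# the arithmetic, so memory traffic, not computation, limits throughput.
```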

Method used




Detailed Description of the Embodiments

[0038] Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.

[0039] Basic concepts of neural network processors

[0040] With the continuous development of artificial intelligence, machine learning and neural network algorithms in recent years, convolutional neural networks have achieved superhuman performance in image classification, recognition, detection and tracking. Due to the enormous parameter scale and computational load of convolutional neural networks, as well as the requirements for hard...



Abstract

The invention discloses a computing platform implementation method for a neural network. The computing platform reads required data from an external memory and caches the read data and the intermediate result of each operation in an on-chip cache. The method comprises the following steps: reading a first part of the feature-map data required by a first operation from the external memory, executing the first operation and at least one other operation on the first part of the data, and storing the operation result for the first part of the data back to the external memory; then reading a second part of the feature-map data required by the first operation from the external memory, executing the first operation and the at least one other operation on the second part of the data, and storing the corresponding operation result back to the external memory. With this scheme, the number of data transfers between off-chip storage and on-chip cache can be greatly reduced, thereby increasing the overall processing speed.
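
A minimal sketch of the tile-wise fused processing described above, assuming a toy single-channel 3x3 convolution followed by a ReLU as the "first operation and at least one other operation". All names (fused_tiled_inference, conv3x3, tile_h) are illustrative, not taken from the patent.

```python
import numpy as np

def conv3x3(x, w):
    """Naive 3x3 'same' convolution over a single-channel tile (illustrative)."""
    h, width = x.shape
    padded = np.pad(x, 1)
    out = np.zeros_like(x)
    for i in range(h):
        for j in range(width):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * w)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def fused_tiled_inference(feature_map, weights, tile_h=32):
    """Process the feature map one horizontal slice ("part of data") at a time.

    Each iteration emulates: read a slice from external memory, run the first
    operation (convolution) and at least one other operation (ReLU) while the
    data stays in on-chip cache, then write only the fused result back.
    """
    h = feature_map.shape[0]
    result = np.zeros_like(feature_map)
    for top in range(0, h, tile_h):
        rows = min(tile_h, h - top)
        # "Read a part of the feature-map data required by the first operation",
        # including one halo row above/below for the 3x3 kernel.
        halo_top = max(top - 1, 0)
        halo_bot = min(top + rows + 1, h)
        tile = feature_map[halo_top:halo_bot]          # simulated off-chip read

        # First operation plus one further operation, fused on chip:
        # the convolution output never touches external memory.
        tile_out = relu(conv3x3(tile, weights))

        # Store only this slice's own rows back to "external memory".
        own = top - halo_top
        result[top:top + rows] = tile_out[own:own + rows]
    return result

if __name__ == "__main__":
    fmap = np.random.rand(128, 128).astype(np.float32)
    w = np.random.rand(3, 3).astype(np.float32)
    print(fused_tiled_inference(fmap, w).shape)        # (128, 128)
```

The point of the sketch is that the intermediate convolution output for each slice never leaves the simulated on-chip buffer; only the result of the fused operations is written back, which is what cuts the number of off-chip transfers.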

Description

Technical Field

[0001] The present invention relates to the field of deep learning, and in particular to a computing platform implementation method for neural networks and a related system.

Background Technique

[0002] Neural networks have become a research hotspot in the field of image recognition in recent years. A trained neural network model can be used in many fields such as image classification, object recognition and saliency detection. In recent years, neural network models have shown a trend of growing computational scale and increasing complexity, and traditional CPU platforms can no longer meet their practical requirements. Therefore, using heterogeneous computing platforms such as FPGAs and GPUs to design neural network accelerators has become a new research hotspot. Among them, owing to its low power consumption, an FPGA can achieve higher energy efficiency than a GPU platform. At the same time, FPGAs can quickly iterat...

Claims


Application Information

IPC(8): G06F3/06
CPC: G06F3/0613, G06F3/0655, G06F3/0656, G06F3/068
Inventors: 隋凌志 (Sui Lingzhi), 王雨顺 (Wang Yushun), 刘鑫 (Liu Xin)
Owner: XILINX TECH BEIJING LTD