
FPGA-based spatiotemporal graph neural network accelerator structure

A neural-network accelerator technology in the field of spatiotemporal graph neural network accelerator structures. It addresses problems such as poor achievable performance, difficulty adapting to network models that combine multiple computing styles, and complex computational characteristics, achieving efficient processing, strong generality, and a simple design.

Pending Publication Date: 2022-05-27
国家超级计算深圳中心(深圳云计算中心) [National Supercomputing Center in Shenzhen (Shenzhen Cloud Computing Center)]
Cites: 0 · Cited by: 0

AI Technical Summary

Problems solved by technology

Existing technology generally accelerates one particular neural network structure and adapts poorly to network models that combine multiple computing styles.
The spatiotemporal graph neural network is a model that contains both a graph convolutional network (GCN) and a gated recurrent unit (GRU). Its computation includes dense matrix operations, sparse matrix operations, element-wise multiplication and addition, and several activation functions. These complex computational characteristics make it difficult to achieve good performance on accelerators with existing architectures.
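To make the mixed computing styles concrete, the sketch below shows a minimal spatiotemporal-graph cell: a graph convolution (matrix products against an adjacency matrix, sparse in practice) feeding a GRU step (dense products, element-wise multiply/add, and several activations). This is an illustrative textbook-style formulation, not the patented design; all names and sizes are ours.

```python
import numpy as np

def gcn_layer(adj, x, w):
    """Spatial step: graph convolution A @ X @ W (A would be sparse in hardware)."""
    return adj @ x @ w

def gru_step(x, h, wz, wr, wh):
    """Temporal step: a gated recurrent unit over the node features.
    Mixes dense matmuls, element-wise products, sigmoid, and tanh."""
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    z = sigmoid(np.concatenate([x, h], axis=-1) @ wz)        # update gate
    r = sigmoid(np.concatenate([x, h], axis=-1) @ wr)        # reset gate
    h_tilde = np.tanh(np.concatenate([x, r * h], axis=-1) @ wh)
    return (1 - z) * h + z * h_tilde                          # element-wise mix

# Toy sizes: 4 nodes, 3 features, hidden size 3.
rng = np.random.default_rng(0)
adj = np.eye(4)                            # placeholder adjacency
x = rng.standard_normal((4, 3))
h = np.zeros((4, 3))
w = rng.standard_normal((3, 3))
wz, wr, wh = (rng.standard_normal((6, 3)) for _ in range(3))
h_next = gru_step(gcn_layer(adj, x, w), h, wz, wr, wh)
print(h_next.shape)  # (4, 3)
```

A single cell already spans dense matmul, sparse matmul, element-wise arithmetic, and two activation functions, which is why a fixed single-style accelerator struggles with this workload.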




Embodiment Construction

[0056] To overcome the defect that existing technology can only accelerate one particular neural network structure and adapts poorly to network models containing multiple computing styles, the present invention designs an FPGA-based spatiotemporal graph neural network accelerator structure with a parallel processing scheme of dual tensor and vector acceleration modules. The vector acceleration module performs sparse matrix multiplication or element-wise multiplication and addition, while the tensor acceleration module performs dense matrix multiplication, bias-term addition, and different activation functions. The system control module directs the tensor and vector acceleration modules to complete the computations corresponding to a set computation sequence, and the data flow control module controls the data between the tensor acceleration module and t...
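The control scheme described above can be sketched in software as a scheduler that walks a fixed computation sequence, routes each operation to the appropriate module, and steers intermediate buffers between them. This is a behavioral sketch under our own op names and routing tables, not the patent's hardware implementation.

```python
import numpy as np

# Hypothetical op names and routing tables; these labels are ours,
# not the patent's terminology.
TENSOR_OPS = {"dense_matmul", "bias_add", "activation"}
VECTOR_OPS = {"sparse_matmul", "elementwise_add"}

def tensor_module(op, a, b=None):
    # Stand-in for the tensor acceleration module.
    if op == "dense_matmul":
        return a @ b
    if op == "bias_add":
        return a + b
    return np.tanh(a)  # "activation"

def vector_module(op, a, b=None):
    # Stand-in for the vector acceleration module (sparse / element-wise work).
    if op == "sparse_matmul":
        return a @ b   # a would be a sparse matrix in hardware
    return a + b       # "elementwise_add"

def run_schedule(schedule, env):
    """System-control sketch: execute ops in the set order, routing each
    to the tensor or vector module and steering buffers between them."""
    for op, srcs, dst in schedule:
        args = [env[s] for s in srcs]
        module = tensor_module if op in TENSOR_OPS else vector_module
        env[dst] = module(op, *args)
    return env

# Toy GCN-like pass: sparse A @ X on the vector module, then X @ W,
# bias add, and activation on the tensor module.
env = {
    "A": np.eye(3), "X": np.ones((3, 2)),
    "W": np.ones((2, 2)), "b": np.zeros((3, 2)),
}
schedule = [
    ("sparse_matmul", ["A", "X"], "AX"),
    ("dense_matmul", ["AX", "W"], "XW"),
    ("bias_add", ["XW", "b"], "Z"),
    ("activation", ["Z"], "H"),
]
out = run_schedule(schedule, env)["H"]
print(out.shape)  # (3, 2)
```

Keeping the sparse and element-wise work on one module and the dense work on the other lets the two run in parallel on different stages of the sequence, which is the benefit the dual-module design claims.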



Abstract

The invention discloses an FPGA-based spatiotemporal graph neural network accelerator structure. A parallel processing scheme with dual tensor and vector acceleration modules is designed: the vector acceleration module executes sparse matrix multiplication or element-wise multiplication and addition, while the tensor acceleration module executes dense matrix multiplication, bias-term addition, and different activation functions. The system control module directs the tensor and vector acceleration modules to complete the computations corresponding to a set computation timing sequence, and the data flow control module steers the data flow between the two modules so that computation completes in a cyclic manner. The overall architecture achieves efficient processing of three-dimensional spatiotemporal graph neural networks and realizes multi-function computation in a simple way. Because it can accelerate multiple neural network computing modes, it is downward compatible with the acceleration of network models such as the graph convolutional network (GCN) and the gated recurrent unit (GRU), giving it strong generality.

Description

Technical field

[0001] The invention relates to the field of spatiotemporal graph data processing, in particular to an FPGA-based spatiotemporal graph neural network accelerator structure.

Background technique

[0002] A spatiotemporal graph is a three-dimensional network data structure with both time and space, a kind of dynamic graph. It is widely present in graph data that carries temporal information, such as social networks and sensor networks. Unlike static graphs, where the information of each node and connection does not change over time, dynamic graphs have an additional time dimension, and their nodes and connections may change over time. Spatiotemporal graphs have important application and research value in recommendation systems, traffic forecasting, and epidemic-spread forecasting. With the continuous growth of data volume, artificial intelligence (AI) algorithms have developed greatly and are widely used in many fields such as image, voice, an...
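A compact way to picture the three-dimensional structure described above is as a pair of tensors: node features indexed by time, and one adjacency matrix per time step. The representation below is our own illustrative choice, not the patent's data layout.

```python
import numpy as np

# A spatiotemporal graph as tensors: T time steps, N nodes, F features per node.
T, N, F = 5, 4, 2
features = np.zeros((T, N, F))           # node features evolving over time
adjacency = np.stack([np.eye(N)] * T)    # one adjacency matrix per time step

# Static graph:  adjacency[t] is identical for every t.
# Dynamic graph: edges may appear or disappear over time, e.g. an
# edge 0 -> 1 that exists only at time step 3:
adjacency[3, 0, 1] = 1.0

print(features.shape, adjacency.shape)   # (5, 4, 2) (5, 4, 4)
```

The extra leading time axis is exactly what distinguishes this data from the static graphs that conventional GCN accelerators target.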

Claims


Application Information

IPC(8): G06N3/063; G06N3/04; G06N3/08
CPC: G06N3/063; G06N3/049; G06N3/08; G06N3/045; Y04S10/50
Inventor: 靳超, 黄典, 冯圣中
Owner 国家超级计算深圳中心(深圳云计算中心)