
Dynamic extensible convolutional neural network accelerator

A convolutional neural network accelerator technology, applied in the fields of computing, calculation, and counting, which can achieve effects such as reduced bandwidth, reduced neural network operation delay, and low latency.

Inactive Publication Date: 2020-01-17
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to address the shortcomings of the above-mentioned background technology by proposing a dynamically scalable convolutional neural network accelerator that improves the computing efficiency of the convolutional neural network, realizes data reuse, reduces neural network operation delay, and reduces bandwidth, thereby solving the technical problem that existing neural network accelerators cannot meet the low-latency, low-bandwidth task requirements of high-performance application scenarios.

Method used



Examples

Embodiment Construction

[0034] The present invention is further illustrated below in conjunction with specific embodiments. It should be understood that these embodiments are only used to illustrate the present invention and are not intended to limit its scope; after reading the present invention, those skilled in the art will understand that all modifications of its various equivalent forms fall within the scope defined by the appended claims of this application.

[0035] As shown in figure 1, the computing array of the Convolutional Neuron Processing Unit (CNPU) adopts a heterogeneous design. The CNPU calculation subsystem includes a calculation array based on multiply-accumulate circuits (MA-NPEA), a calculation array based on look-up table multipliers (LUT-NPEA), and a shared memory between the arrays (Shared Memory), with two of each computing array. MA-NPEA consists of basic circuits such as approximate multipliers and approximate adders, and...
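To make the LUT-NPEA idea concrete, the following is a minimal behavioural sketch, not the patent's circuit: it assumes 8-bit unsigned activations and per-kernel constant weights, precomputes each weight's product table once, and then evaluates a dot product purely by table lookups and additions, matching what an exact multiply-accumulate path (here standing in for MA-NPEA) would produce. The function names (build_weight_lut, lut_dot, mac_dot) are illustrative and not taken from the patent.

```python
# Behavioural sketch of a look-up-table multiplier path (assumption: 8-bit
# activations, constant per-kernel weights); not the patent's RTL design.
import numpy as np

def build_weight_lut(weight: int, bits: int = 8) -> np.ndarray:
    """Precompute weight * x for every possible activation value x."""
    return weight * np.arange(2 ** bits, dtype=np.int32)

def lut_dot(activations: np.ndarray, luts: list) -> int:
    """Dot product realised purely as table lookups and additions."""
    return int(sum(lut[a] for a, lut in zip(activations, luts)))

def mac_dot(activations: np.ndarray, weights: np.ndarray) -> int:
    """Reference multiply-accumulate path (exact computation)."""
    return int(np.dot(activations.astype(np.int64), weights.astype(np.int64)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    weights = rng.integers(-128, 128, size=9)           # 3x3 kernel
    acts = rng.integers(0, 256, size=9)                 # 8-bit activations
    luts = [build_weight_lut(int(w)) for w in weights]  # one LUT per weight
    assert lut_dot(acts, luts) == mac_dot(acts, weights)
    print("LUT path matches MAC path:", lut_dot(acts, luts))
```

In hardware, such a table would presumably sit in on-chip memory and be indexed directly by the activation bits, which is consistent with the abstract's point that raising the look-up table's access parallelism raises the computing array's operation speed.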



Abstract

The invention discloses a dynamic extensible convolutional neural network accelerator, and belongs to the technical field of computing, calculation and counting. The accelerator comprises an adaptive data storage module and an efficient computing array scheduling module. The adaptive data storage module comprises a hierarchical storage module and an external two-dimensional data conversion customization interface module. The efficient computing array scheduling module comprises a neuron processing unit array data scheduling module based on multiply-add logic and a neuron processing unit array data scheduling module based on a lookup table. The whole convolutional neural network accelerator masks the data delay of external memory access by designing a reasonable multi-level storage structure. Data scheduling is carried out on the computing array according to network layer characteristics and task requirements, so that data reuse is realized, the look-up table access parallelism of the computing array is improved, the operation speed is increased, and the accelerator can adapt to various complex computing tasks.
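As a hedged illustration of how a multi-level storage structure can mask external memory access delay, the sketch below overlaps the fetch of the next data tile with computation on the current one (a ping-pong, i.e. double-buffering, scheme). All names (fetch_tile, compute_tile, NUM_TILES) and timings are hypothetical stand-ins, not details from the patent.

```python
# Double-buffering sketch: prefetch tile i+1 from "external memory" while the
# compute array works on tile i, so DRAM latency is largely hidden.
from concurrent.futures import ThreadPoolExecutor
import time

NUM_TILES = 4
FETCH_LATENCY_S = 0.05    # pretend external-memory access time
COMPUTE_TIME_S = 0.05     # pretend compute-array time per tile

def fetch_tile(i: int) -> list:
    """Stand-in for a DMA transfer from external DRAM into on-chip SRAM."""
    time.sleep(FETCH_LATENCY_S)
    return [i] * 8

def compute_tile(tile: list) -> int:
    """Stand-in for the computing arrays consuming one on-chip tile."""
    time.sleep(COMPUTE_TIME_S)
    return sum(tile)

def run_double_buffered() -> list:
    results = []
    with ThreadPoolExecutor(max_workers=1) as prefetcher:
        pending = prefetcher.submit(fetch_tile, 0)      # fill first buffer
        for i in range(NUM_TILES):
            tile = pending.result()                     # wait for current tile
            if i + 1 < NUM_TILES:                       # start next fetch early
                pending = prefetcher.submit(fetch_tile, i + 1)
            results.append(compute_tile(tile))          # overlaps with the fetch
    return results

if __name__ == "__main__":
    start = time.perf_counter()
    print(run_double_buffered(), f"{time.perf_counter() - start:.2f}s")
    # Roughly NUM_TILES * COMPUTE_TIME_S plus one fetch, instead of the
    # serial NUM_TILES * (FETCH_LATENCY_S + COMPUTE_TIME_S).
```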

Description

Technical field

[0001] The invention discloses a dynamically expandable convolutional neural network accelerator, relates to the physical realization of neural networks, and belongs to the technical field of computing, calculation and counting.

Background technique

[0002] With the advancement of computer and communication technology, Internet data has exploded. The processing of massive data has become a great challenge, and traditional methods can no longer cope with it. Deep learning has emerged as a viable approach for big data processing. Deep learning is an important part of artificial intelligence and can more realistically simulate the working mechanism of the human brain to achieve better results. Among them, the convolutional neural network in deep learning has made remarkable achievements in the field of image processing.

[0003] Convolutional neural networks can be deployed on the cloud, but localized operation of convolutional neural...

Claims


Application Information

IPC(8): G06N3/063; G06F15/167; G06F12/0897; G06F12/0862; G06F12/0868; G06F9/50; G06F13/18
CPC: G06F9/5027; G06F12/0862; G06F12/0868; G06F12/0897; G06F13/18; G06F15/167; G06F2212/1024; G06N3/063
Inventors: 刘波, 李焱, 黄乐朋, 孙煜昊, 沈泽昱, 杨军
Owner: SOUTHEAST UNIV