
On-chip neural network-oriented synaptic implementation architecture

A neural network and synapse technology in the field of synaptic implementation architectures based on three-level indexing and compressed, shared storage, addressing the problem of supporting a very large number of connections

Active Publication Date: 2021-05-11
ZHEJIANG LAB +1

AI Technical Summary

Problems solved by technology

For a network topology with millions of neurons, the number of synaptic connections is enormous, which poses a challenge for chips with limited on-chip resources.



Examples


Detailed Description of the Embodiments

[0017] In order to make the objects, technical solutions, and technical effects of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments.

[0018] As shown in Figure 1 and Figure 2, the on-chip-neural-network-oriented synapse implementation architecture based on three-level indexing and compressed, shared storage works as follows: a synapse organized in the three-level index structure mode 1 connects a pre-synaptic neuron 4 to a post-synaptic neuron 7. The connection information and weights of the three-level index structure can be stored in the synaptic storage device, or allocated to the pre-synaptic neuron 4 and the post-synaptic neuron 7 as required, so as to optimize synaptic resources and facilitate the on-chip neural network implementation. The synaptic storage device adopts the weight-sharing compressed storage method 2 and the storage space...
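To make the three-level lookup concrete, the following minimal software sketch (not from the patent) shows one way a pre-synaptic neuron could resolve its targets through three levels of indices while all synapses draw their weights from a single de-duplicated, shared table. The level semantics (neuron pointer table, connection-block index, shared-weight index) and every name in the code are illustrative assumptions; the patent's actual hardware mapping may differ.

```python
# Illustrative sketch only: a three-level indexed synapse table with shared weights.
# The meaning of each level (neuron pointer -> connection block -> shared-weight
# index) is an assumption for exposition, not the patent's exact definition.

class SynapseStore:
    def __init__(self):
        self.neuron_ptr = {}      # level 1: pre-synaptic neuron id -> set of block ids
        self.blocks = {}          # level 2: block id -> list of (post_id, weight_index)
        self.shared_weights = []  # level 3: de-duplicated (shared) weight table

    def _weight_index(self, w):
        # Store each distinct weight value once and reuse its index (weight sharing).
        if w not in self.shared_weights:
            self.shared_weights.append(w)
        return self.shared_weights.index(w)

    def connect(self, pre_id, post_id, weight, block_id=0):
        # Register one synapse: pre_id -> post_id with the given weight.
        widx = self._weight_index(weight)
        self.neuron_ptr.setdefault(pre_id, set()).add(block_id)
        self.blocks.setdefault(block_id, []).append((post_id, widx))

    def fanout(self, pre_id):
        # Resolve all targets of a pre-synaptic neuron through the three levels.
        for block_id in sorted(self.neuron_ptr.get(pre_id, ())):
            for post_id, widx in self.blocks[block_id]:
                yield post_id, self.shared_weights[widx]


if __name__ == "__main__":
    store = SynapseStore()
    # 1000 synapses from neuron 4, but only two distinct weight values are stored.
    for post in range(1000):
        store.connect(pre_id=4, post_id=post, weight=0.25 if post % 2 else -0.5)
    print(len(store.shared_weights))  # -> 2
    print(next(store.fanout(4)))      # -> (0, -0.5)
```

In this toy model the benefit of the shared weight table is that the per-synapse cost shrinks to a small index, which is the general idea behind the compressed storage method referenced above.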



Abstract

The invention belongs to the technical field of neural synapse implementation for brain-like computing chips, and relates to an on-chip neural-network-oriented synapse implementation architecture based on three-level indexing and compressed, shared storage. Synapses connect pre-synaptic neurons and post-synaptic neurons through a three-level index structure; the connection information and weights in this structure can be stored in a synaptic storage device or distributed to the pre-synaptic and post-synaptic neurons. The synaptic storage device adopts a weight-sharing compressed storage mode and a scenario-based storage-space sharing mechanism. The architecture supports flexible configuration of the neural network topology and makes efficient use of its storage space.
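As a rough, non-authoritative illustration of the weight-sharing compression mentioned in the abstract, the sketch below de-duplicates per-synapse weights into a small shared table plus narrow per-synapse indices and compares the storage footprint. Every size, dtype, and weight value here is an assumption chosen for demonstration, not a parameter from the patent.

```python
# Rough illustration of the weight-sharing idea only; the sizes, dtypes, and the
# four-value weight alphabet below are assumptions, not parameters from the patent.
import numpy as np

rng = np.random.default_rng(0)
per_synapse = rng.choice([-0.5, 0.0, 0.25, 1.0], size=10_000)  # one weight per synapse

# Compressed form: a small table of distinct weights plus a narrow index per synapse.
shared, idx = np.unique(per_synapse, return_inverse=True)

dense_bytes  = per_synapse.astype(np.float32).nbytes            # 4 bytes per synapse
shared_bytes = shared.astype(np.float32).nbytes + idx.astype(np.uint8).nbytes
print(dense_bytes, shared_bytes)  # e.g. 40000 vs 10016 bytes for this toy case
```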

Description

Technical Field

[0001] The invention belongs to the technical field of neural synapse implementation for brain-like computing chips, and relates to a synapse implementation architecture based on three-level indexing and compressed, shared storage for on-chip neural networks.

Background

[0002] In recent years, the "memory wall" and "power wall" effects have become increasingly serious, and the von Neumann architecture followed by traditional computers faces huge challenges. In the post-Moore era, the semiconductor industry urgently needs new architectures and methods to meet the electronics industry's demand for ever-higher computing performance at extremely low power consumption. With the development of brain science, it has become increasingly clear that the human brain is an extremely energy-efficient computer, and brain-like computing has emerged in response. Combining memory with the computing unit fundamentally eliminates the "memor...

Claims


Application Information

IPC(8): G06N3/06, G06N3/063, G06F15/78
CPC: G06N3/061, G06N3/063, G06F15/7817
Inventors: 孙世春, 金孝飞, 陆启明, 章明, 朱国权, 郝康利, 韩佩卿, 凡军海, 马德, 朱晓雷, 潘纲
Owner ZHEJIANG LAB