
Compression device for deep neural network

A deep-neural-network compression technology, applicable to biological neural network models, neural architectures, physical implementations, etc. It addresses problems such as the burden that large data transfers place on the network on chip, the resulting significant increase in on-chip transmission delay, and the impact on system performance, and achieves the effect of reduced delay.

Active Publication Date: 2018-01-16
INST OF COMPUTING TECH CHINESE ACAD OF SCI

Problems solved by technology

However, this huge data transmission demand places a heavy burden on the network on chip, significantly increasing its transmission delay and degrading system performance.



Embodiment Construction

[0047] The present invention is described in detail below with reference to the accompanying drawings and specific embodiments.

[0048] Figure 2 shows a schematic structural diagram of a prior-art deep neural network acceleration system based on 3D memory. As shown in Figure 2, a deep neural network computing unit is integrated on the logic layer of the 3D memory and is connected, through a memory controller, to its local vault. The memory controllers of different vaults exchange data over a shared on-chip network, and the memory controller of the local vault routes data to and from remote vaults through the routers of the on-chip network.
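To make the routing behavior concrete, the following is an illustrative sketch, not taken from the patent, of how a request from the computing unit either hits its local vault directly or crosses the on-chip network to a remote vault, accumulating per-hop delay. All class names, hop-distance formulas, and latency figures here are assumptions chosen for demonstration.

```python
# Illustrative model of vault access in a 3D-memory accelerator.
# Names, hop counts, and latency numbers are assumptions for
# demonstration; the patent does not specify them.

class Vault:
    def __init__(self, vault_id):
        self.vault_id = vault_id

    def read(self, addr):
        return f"data@{self.vault_id}:{addr}"

def access(local_vault, target_vault, addr, hop_latency=2, memory_latency=10):
    """Return (data, total_latency). Remote accesses pay per-hop NoC delay."""
    # Simplistic linear distance; a real NoC would use mesh routing.
    hops = abs(target_vault.vault_id - local_vault.vault_id)
    latency = memory_latency + 2 * hops * hop_latency  # request + response traversal
    return target_vault.read(addr), latency

vaults = [Vault(i) for i in range(4)]
local = vaults[0]
_, t_local = access(local, vaults[0], 0x10)   # local vault: no NoC hops
_, t_remote = access(local, vaults[3], 0x10)  # remote vault: 3 hops each way
print(t_local, t_remote)  # remote access is slower
```

The gap between `t_local` and `t_remote` is precisely the on-chip-network delay that the patent's compressor aims to reduce by shrinking the traffic that must cross the network.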

[0049] Referring to Figure 2, when the deep neural network computing unit begins a neural network computation, it must send its data request to the memory controller of the local vault to which it is connected; if the location ...


Abstract

The invention provides an acceleration system for a deep neural network. The system includes a 3D memory; a deep neural network computing unit connected with memory controllers on the logic layers of the vaults of the 3D memory; routers connected with the memory controllers; a compressor; and a decompression device. The memory controller of each vault transmits data through its connected router over the network on chip. The compressor compresses the deep-neural-network data that needs to be transmitted on the network on chip, and the decompression device decompresses the deep-neural-network data arriving from the network on chip.
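The abstract does not disclose which compression algorithm the compressor applies. As a hedged illustration only, the sketch below uses run-length encoding of repeated values, a plausible choice because the activations produced by ReLU layers of a deep neural network are typically dominated by runs of zeros. The function names and data are invented for this example.

```python
# Hedged sketch: run-length encoding as one plausible compression
# scheme for sparse DNN data crossing the network on chip. The
# patent does not specify this algorithm; it is an assumption.

def compress(values):
    """Encode a list of ints as [value, run_length] pairs."""
    out = []
    for v in values:
        if out and out[-1][0] == v:
            out[-1][1] += 1
        else:
            out.append([v, 1])
    return out

def decompress(pairs):
    """Invert compress(): expand each [value, run_length] pair."""
    out = []
    for v, n in pairs:
        out.extend([v] * n)
    return out

# Sparse ReLU-style activations: mostly zeros.
activations = [0, 0, 0, 5, 0, 0, 7, 7, 0, 0, 0, 0]
packed = compress(activations)
assert decompress(packed) == activations
# 12 values shrink to 5 [value, count] pairs before crossing the NoC.
```

Lossless round-tripping matters here: the decompression device on the receiving vault must reconstruct the exact data the computing unit requested.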

Description

technical field [0001] The invention relates to the acceleration of deep neural networks, and in particular to the data processing of deep neural networks. Background technique [0002] With the development of artificial intelligence technology, techniques involving deep neural networks, especially convolutional neural networks, have advanced rapidly in recent years and have been widely applied in fields such as image recognition, speech recognition, natural language understanding, weather prediction, gene expression, content recommendation, and intelligent robots. A deep neural network can be understood as a computational model containing a large number of data nodes; each data node is connected to other data nodes, and the connection between nodes is represented by a weight. As deep neural networks continue to develop, so does their complexity. Since calculations using deep neural networks often require cyclic operations on a l...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04, G06N3/063
CPC: G06N3/04, G06N3/063
Inventors: 翁凯衡, 韩银和, 王颖
Owner INST OF COMPUTING TECH CHINESE ACAD OF SCI