
Deep learning oriented sparse self-adaptive neural network, algorithm and implementation device

A neural network algorithm and deep-learning technology, applied to biological neural network models and their physical implementation, that solves problems such as loss of accuracy and achieves the effect of reduced storage requirements

Inactive Publication Date: 2016-04-13
CHONGQING UNIV
Cites: 0 · Cited by: 28
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, quantizing floating-point data with traditional methods and computing directly in fixed point often leads to a loss of accuracy.
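The accuracy loss mentioned above can be illustrated with a minimal sketch (not from the patent): rounding floating-point weights to a fixed-point format with a few fractional bits introduces a per-weight error of up to half a least significant bit. The 4-fractional-bit format chosen here is an assumption for illustration.

```python
# Sketch: quantize float weights to a fixed-point grid and measure the error.
def to_fixed_point(x, frac_bits=4):
    """Round x to the nearest value representable with `frac_bits` fractional bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

weights = [0.137, -0.482, 0.9031, -0.0625]
quantized = [to_fixed_point(w) for w in weights]
errors = [abs(w - q) for w, q in zip(weights, quantized)]

print(quantized)                      # [0.125, -0.5, 0.875, -0.0625]
print(max(errors) <= 1 / (1 << 5))    # worst-case error is half an LSB: True
```

Accumulated over millions of multiply-accumulate operations, these per-weight errors are what degrades recognition accuracy in a naive fixed-point port.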

Method used


Examples


Embodiment 1

[0038] A traditional artificial neural network generally includes a visible layer with a number of input nodes and a hidden layer with a number of output nodes. Some designs place a label layer at the topmost network layer; this is an optional, not required, part of the present invention. The nodes of a hidden layer are connected to the input nodes of the visible layer by weights. When there are two or more hidden layers, each hidden layer is connected to the next one: once the hidden layer of the lower-level network is trained, it serves as the visible layer of the higher-level network.
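The layer-stacking rule described above can be sketched as follows. This is an illustrative greedy-stacking skeleton, not the patent's implementation; the class name, the threshold activation, and the layer sizes are all assumptions.

```python
# Sketch: after a lower RBM layer is processed, its hidden-layer output
# becomes the visible-layer input of the next RBM, as the text describes.
import random

class RBM:
    def __init__(self, n_visible, n_hidden):
        self.n_visible, self.n_hidden = n_visible, n_hidden
        # Dense real-valued weights, as in the classic DBN the patent starts from.
        self.w = [[random.uniform(-0.1, 0.1) for _ in range(n_hidden)]
                  for _ in range(n_visible)]

    def hidden_activations(self, visible):
        # A binary threshold stands in for the usual sigmoid sampling.
        sums = [sum(visible[i] * self.w[i][j] for i in range(self.n_visible))
                for j in range(self.n_hidden)]
        return [1 if s > 0 else 0 for s in sums]

# Greedy layer-wise stacking: each layer's output feeds the next layer.
layers = [RBM(784, 256), RBM(256, 64)]   # sizes are arbitrary examples
v = [random.randint(0, 1) for _ in range(784)]
for rbm in layers:
    v = rbm.hidden_activations(v)        # hidden layer becomes next visible layer
print(len(v))  # 64
```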

[0039] Figure 1 is a schematic diagram of the classic DBN model. In a DBN, the parameters describing the connections are dense real numbers, and the computation of each layer is a matrix multiplication between the interconnected units and their excitations. A large number of floating-point data multip...

Embodiment 2

[0077] Early sparse-DBN research focused only on extracting sparse features, rather than on using sparse connections to generate efficient network architectures for hardware models. Recent neuromorphic hardware models for deep learning integrate an increasing number of neurons on a chip, but putting a million neurons and a billion synapses on a single chip is still no small challenge. Figure 4 shows a deep-learning-oriented sparse adaptive neural network optimization and implementation device; its MAP table and TABLE table are obtained by the DAN sparse algorithm described in the present invention.

[0078] The specific workflow is as follows:

[0079] 1) Detect whether the input bit axon[i] is 1. If it is 1, a synaptic event has arrived, so access the corresponding position in the MAP table according to the value of i; if it is 0, detect the next input bit.

[0080] 2) Read out the corresponding start address and length value from the MAP table; if the length value is not 0, ...
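The event-driven lookup of steps 1)–2) can be sketched as below, under the assumption that each MAP entry holds a (start address, length) pair and that TABLE holds the indices of target neurons; the excerpt does not specify the exact entry layout, so this structure is illustrative.

```python
# Sketch: scan input bits, and for each synaptic event (bit == 1) follow the
# MAP entry to the run of TABLE entries that names the target neurons.
def process_spikes(axon, MAP, TABLE, potentials):
    for i, bit in enumerate(axon):
        if bit != 1:
            continue                      # step 1: no synaptic event, next bit
        start, length = MAP[i]            # step 2: read start address and length
        if length == 0:
            continue                      # no outgoing connections for this axon
        for target in TABLE[start:start + length]:
            potentials[target] += 1       # accumulate into each target neuron
    return potentials

# Toy example: axon 0 fans out to neurons 2 and 3; axon 2 fans out to neuron 0.
MAP = [(0, 2), (0, 0), (2, 1)]
TABLE = [2, 3, 0]
print(process_spikes([1, 0, 1], MAP, TABLE, [0, 0, 0, 0]))  # [1, 0, 1, 1]
```

Because only axons that actually fire trigger table accesses, the sparse MAP/TABLE pair replaces a full weight-matrix traversal.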



Abstract

The invention discloses a deep-learning-oriented sparse self-adaptive neural network. The network comprises at least one layer of self-adaptive restricted Boltzmann machine, and each such layer comprises a visible layer and a hidden layer that are sparsely connected. In the disclosed network, in addition to the sparse connection between the visible and hidden layers, each connection represented by a 32-bit real number is optimized into a connection represented by a 1-bit integer. This optimization does not impair pattern recognition and still satisfies the precision requirements, so a large-scale neural network can be realized on a single chip using only fixed-point arithmetic and a small amount of multiplication.
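The 32-bit-to-1-bit optimization in the abstract can be sketched as follows. The sign-threshold binarization rule and the dictionary representation are assumptions for illustration; the point is that once each sparse connection carries only one bit, a neuron's input sum reduces to integer additions with no floating-point multiplies.

```python
# Sketch: replace sparse 32-bit real weights with 1-bit values and compute a
# neuron's input using additions only.
real_weights = {0: 0.42, 3: -0.17, 7: 0.88}      # sparse: axon index -> 32-bit real
one_bit = {i: 1 if w >= 0 else 0 for i, w in real_weights.items()}

def neuron_input(active_bits, one_bit_weights):
    # +1 for an excitatory 1-bit connection, -1 for an inhibitory one:
    # pure integer additions, no floating-point multiply.
    return sum(1 if one_bit_weights[i] else -1
               for i in active_bits if i in one_bit_weights)

print(neuron_input({0, 3, 7}, one_bit))  # +1 - 1 + 1 = 1
```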

Description

technical field

[0001] The invention relates to the field of integrated circuits, neural networks and big-data computing, and in particular to the construction and optimization of on-chip deep self-adaptive neural network models.

Background technique

[0002] In this technical field, existing realizations of neural network models divide mainly into software implementations and hardware implementations.

[0003] Software implementation: typically, a specific neural network algorithm is run on a general-purpose processor (CPU) or a general-purpose graphics processing unit (GPGPU) based on the von Neumann architecture. In neural network models such as the classic DBN model, the connections between neurons are realized by a matrix of stored weight values. As the number of neurons n increases, the size of the weight matrix grows explosively as O(n²), which means that a large amount of storage (such as memory) must be consumed. Li...
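A back-of-envelope check of the O(n²) storage growth mentioned above, comparing a dense 32-bit weight matrix with a sparse 1-bit store. The 1% connection density is an assumed figure for illustration, not taken from the patent.

```python
# Sketch: storage cost in bits for dense 32-bit weights vs. sparse 1-bit weights.
def dense_bits(n):
    return n * n * 32               # n^2 weights, 32 bits each

def sparse_1bit_bits(n, density=0.01):
    return int(n * n * density)     # only stored connections, 1 bit each

for n in (1_000, 10_000):
    ratio = dense_bits(n) // sparse_1bit_bits(n)
    print(n, dense_bits(n), sparse_1bit_bits(n), ratio)
# Both costs still grow as O(n^2) when n grows 10x, but at any size the
# sparse 1-bit store is 32 / 0.01 = 3200x smaller than the dense matrix.
```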

Claims


Application Information

IPC(8): G06N3/02, G06N3/06
CPC: G06N3/02, G06N3/061
Inventors: 周喜川, 李胜力, 余磊, 李坤平, 赵昕, 杨帆, 谭跃, 唐枋, 胡盛东, 甘平
Owner: CHONGQING UNIV