
Neural network accelerator heat effect optimization method based on memristor cross array

A neural-network thermal-effect optimization method, applied in the field of memristor crossbar-array neural network accelerators, which addresses the problem of reduced calculation accuracy caused by memristor heating and achieves improved accuracy, reduced thermal influence, and shorter operation time.

Pending Publication Date: 2022-01-28
WUHAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to overcome the problem in the prior art that the thermal effect of the memristor reduces calculation accuracy, and to provide a thermal-effect optimization method for a memristor crossbar-array neural network accelerator that reduces the thermal influence on conductance and improves accuracy.



Examples


Embodiment 1

[0074] A method for optimizing the thermal effect of a neural network accelerator based on a memristor cross array, comprising the following steps:

[0075] Step 1, establish a fast temperature distribution calculation model:

[0076] First, customize the input data and define it as the power value P to obtain the actual power matrix. Select the pulse power of a point (x, y) in the actual power matrix and divide it by the volume to obtain the pulse heat source value. Input the pulse heat source value into ANSYS software, whose finite-element solver yields a pulse temperature matrix. Finally, perform a convolution of the pulse temperature matrix with the actual power matrix and divide by the pulse power to obtain the actual temperature distribution matrix;
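The step above can be sketched numerically. This is a minimal illustration with names of our own choosing: `power` stands in for the actual power matrix, `pulse_temp` for the ANSYS-computed pulse temperature matrix, and `pulse_power` for the selected point's pulse power; none of these values come from the patent.

```python
import numpy as np

def temperature_distribution(power, pulse_temp, pulse_power):
    """Same-size 2-D convolution of the power matrix with the pulse
    temperature response, divided by the pulse power (illustrative)."""
    ph, pw = power.shape
    kh, kw = pulse_temp.shape
    # zero-pad so the output matches the power matrix ("same" convolution)
    padded = np.pad(power, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    flipped = pulse_temp[::-1, ::-1]  # flip the kernel for true convolution
    out = np.empty_like(power, dtype=float)
    for i in range(ph):
        for j in range(pw):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * flipped)
    return out / pulse_power
```

A single heat source reproduces the (scaled) pulse response centred on that cell, which is the superposition idea behind the model.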

[0077] Step 2: Establish the MLP neural network failure assessment model: apply the distribution matrix of the actual temperature T obtain...
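The failure-assessment idea in step 2 is to feed the computed temperature map back into the MLP's weights. As a hypothetical sketch only: the linear drift coefficient `alpha` and reference temperature `t0` below are illustrative assumptions, not values or a model from the patent.

```python
import numpy as np

def thermal_weight_drift(weights, temp, alpha=1e-3, t0=300.0):
    """Hypothetical sketch: scale each MLP weight by a factor that
    shrinks as its memristor cell heats above the reference t0 (K).
    alpha and t0 are illustrative, not the patent's parameters."""
    return weights * (1.0 - alpha * (temp - t0))

w = np.ones((2, 2))                              # nominal weights
t = np.array([[300.0, 310.0], [320.0, 300.0]])   # per-cell temperature map
w_hot = thermal_weight_drift(w, t)               # temperature-perturbed weights
```

Evaluating the MLP with `w_hot` instead of `w` then quantifies how much the temperature rise degrades inference accuracy.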

Embodiment 2

[0081] Embodiment 2 is basically the same as Embodiment 1, and its difference is:

[0082] In step 1, the fast temperature distribution calculation model is established through the following specific steps:

[0083] First, customize the input data and define it as the power value P to obtain the actual power matrix. Use the analytical calculation method to obtain an accurate global temperature distribution of the memristor neural network accelerator, while ensuring that, compared with the traditional numerical calculation method, the maximum temperature-distribution error does not exceed 5%. The fast temperature-distribution calculation model is then established as shown in formula (1):

[0084] T(x, y, τ) = [P(x, y, τ) * T_δ(x, y, τ)] / P_δ    (1)

[0085] where T_δ(x, y, τ) represents the temperature value at position (x, y) in the memristor-based neural network accelerator at time τ under the thermal action of the pulse signal, the power matrix P is a known quantity, and P(x, y, τ) repres...

Embodiment 3

[0101] Embodiment 3 is basically the same as Embodiment 2, and its difference is:

[0102] The neural network accelerator thermal-effect optimization method based on the memristor cross array of the present invention comprises the following steps:

[0103] In the model part, the "power fuzzy" fast temperature calculation method is used to quickly establish a temperature distribution calculation model. The main idea is as follows: the relationship between heat and temperature distribution is treated as a linear signal system, with heat as the input and the corresponding temperature distribution as the output response. The basic principle of signals and systems shows that the output of a linear system can be regarded as the time-domain convolution of the input with the system's impulse response. The traditional method computes every heat source through fine numerical calculation, which takes too long, whereas the method used in the model proposed by the presen...
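The linear-system reading above also explains why the approach is fast: instead of re-solving each heat source numerically, all shifted pulse responses can be superposed in one FFT-based convolution. A sketch under assumed names (`power`, `pulse_temp`, `pulse_power` are ours, not the patent's):

```python
import numpy as np

def fast_temperature(power, pulse_temp, pulse_power):
    """Treat pulse_temp as the impulse response of a linear thermal
    system and convolve it with the input power map via FFT."""
    ph, pw = power.shape
    kh, kw = pulse_temp.shape
    H, W = ph + kh - 1, pw + kw - 1            # full-convolution size
    spec = np.fft.rfft2(power, (H, W)) * np.fft.rfft2(pulse_temp, (H, W))
    full = np.fft.irfft2(spec, (H, W))
    r0, c0 = kh // 2, kw // 2                  # crop back to the power grid
    return full[r0:r0 + ph, c0:c0 + pw] / pulse_power
```

For an N-cell grid this costs O(N log N) rather than one full numerical solve per heat source, which matches the motivation stated above.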



Abstract

A neural network accelerator heat effect optimization method based on a memristor cross array comprises the following steps: 1, establishing a rapid temperature distribution calculation model: selecting the pulse power of one point in the actual power matrix, dividing the pulse power by the volume to obtain a pulse heat source value, inputting the pulse heat source value into ANSYS software to obtain a pulse temperature matrix, carrying out convolution on the obtained pulse temperature matrix and the actual power matrix, and dividing by the pulse power to obtain the actual temperature distribution matrix; 2, establishing an MLP neural network failure evaluation model: applying the distribution matrix of the actual temperature T obtained by the rapid temperature distribution calculation model to the MLP neural network failure evaluation model to obtain the influence of the actual temperature T on the weight values in the MLP neural network model; and 3, carrying out MLP neural network model mapping with off-line thermal optimization. According to the design, the arrangement of the memristor array is optimized, the influence of temperature rise on the memristors is reduced, and the precision of the offline training process of the neural network is improved.

Description

Technical field

[0001] The invention relates to a method for optimizing the thermal effect of a memristor-based crossbar-array neural network accelerator, which is particularly suitable for reducing the thermal effect on the memristor and improving its calculation accuracy.

Background technique

[0002] In recent years, with in-depth research on neural networks, their scale has grown larger and larger, which has brought great challenges to the computing systems for neural network models as well as to the traditional von Neumann architecture. The memristor can perform data processing and storage in the same device unit and is expected to realize an integrated compute-in-memory structure, so it has attracted widespread attention. The memristor is the fourth basic circuit element; it offers high integration and can be built into large-scale crossbar-array structures. A memristor array has analog characteristics. It is based on Ohm's law ...
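The crossbar principle mentioned above can be illustrated in a few lines: by Ohm's law each cell passes current I = G·V, and summing the column currents (Kirchhoff's current law) makes the array compute a matrix-vector product in one analog step. An idealized sketch with made-up values, ignoring wire resistance and device non-idealities:

```python
import numpy as np

# Idealized memristor crossbar: rows driven by voltages V, cell
# conductances G, column output currents I = G^T @ V (one analog MVM).
G = np.array([[1.0, 2.0],      # conductances in siemens (illustrative)
              [3.0, 4.0]])
V = np.array([0.5, 0.25])      # row input voltages
I = G.T @ V                    # currents collected per column
```

This is why temperature-induced conductance drift directly perturbs the network's weights, motivating the thermal optimization of this invention.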

Claims


Application Information

IPC(8): G06F30/337; G06F30/3308; G06N3/063; G06N3/04; G06F119/02; G06F119/06; G06F119/08
CPC: G06F30/337; G06F30/3308; G06N3/063; G06F2119/02; G06F2119/06; G06F2119/08; G06N3/045; Y02D10/00
Inventor: 徐宁, 商梦君
Owner WUHAN UNIV OF TECH