
Memory allocation method of neural network

A memory allocation technology for neural networks, applied in the computer and artificial intelligence fields. It solves problems such as the heavy labor time of manual allocation and its unsuitability for practical use, and achieves complete automation, reduced memory size, and convenient use.

Active Publication Date: 2020-10-23
HANGZHOU NATCHIP SCI & TECH

AI Technical Summary

Problems solved by technology

This method can improve memory utilization, but it requires a great deal of manual effort and time, so it is not suitable for use in actual projects.


Image

Figures: Memory allocation method of neural network


Embodiment Construction

[0042] The technical solutions of the present invention are further described below in conjunction with the accompanying drawings and embodiments. It should be noted that the drawings are for illustration only and should not be construed as limiting the patent. The present invention can be implemented in various forms and should not be limited to the embodiments set forth herein; the following embodiments are provided so that the present invention is easier to understand and is conveyed more completely to those skilled in the art.

[0043] As shown in Figure 1, a neural network memory allocation method proceeds as follows:

[0044] S1. Obtain the calculation units in the calculation graph and number each calculation unit in turn according to the calculation order; the details are as follows (a short code sketch of this step follows the listed sub-steps):

[0045] S11. Traverse the neural network calculation graph, removing the operation units whose input-tensor and output-tensor data storage in memory is ex...
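The numbering and lifetime bookkeeping described in step S1 can be illustrated with the minimal Python sketch below. The CalcUnit class, its inputs/outputs fields, and the helper names are hypothetical stand-ins rather than names from the patent; the sketch simply numbers units in calculation order and records, for each tensor, the first and last calculation numbers at which it is live, as an approximation of the "calculation number set" of a memory-reusable tensor.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass(frozen=True)
class CalcUnit:
    """Hypothetical stand-in for one node (operation) of the calculation graph."""
    name: str
    inputs: Tuple[str, ...]
    outputs: Tuple[str, ...]

def number_calc_units(units: List[CalcUnit]) -> Dict[str, int]:
    """Step S1: number each calculation unit in calculation order.
    `units` is assumed to already be listed in execution order."""
    return {u.name: i for i, u in enumerate(units)}

def tensor_lifetimes(units: List[CalcUnit],
                     numbering: Dict[str, int]) -> Dict[str, Tuple[int, int]]:
    """Record, per tensor, the first and last calculation number at which it is
    produced or consumed; this approximates a tensor's 'calculation number set'."""
    first: Dict[str, int] = {}
    last: Dict[str, int] = {}
    for u in units:
        n = numbering[u.name]
        for t in u.outputs + u.inputs:
            first.setdefault(t, n)
            last[t] = n
    return {t: (first[t], last[t]) for t in first}

# Tiny example graph, executed in listed order: conv -> relu -> pool.
graph = [
    CalcUnit("conv", inputs=("x",), outputs=("a",)),
    CalcUnit("relu", inputs=("a",), outputs=("b",)),
    CalcUnit("pool", inputs=("b",), outputs=("y",)),
]
print(tensor_lifetimes(graph, number_calc_units(graph)))
# {'a': (0, 1), 'x': (0, 0), 'b': (1, 2), 'y': (2, 2)}
```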



Abstract

The invention discloses a memory allocation method for a neural network. The traditional dynamic memory allocation method wastes a great deal of memory, while manual memory allocation costs much labor time. The method comprises the steps of: first, obtaining the calculation units in a calculation graph and numbering all the calculation units in sequence according to the calculation order; obtaining the calculation number sets of the memory-reusable tensors of all calculation units in the model; and determining the final memory allocation mode of the memory-reusable tensors, obtaining the total size of the reusable memory required by the model and the allocated memory address of each memory-reusable tensor. According to the method, the memory fragments generated when the neural network model applies for and releases memory can be effectively reduced, the total memory size required by the neural network model is reduced, and the method can be conveniently used in actual engineering.
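As a rough illustration of how a static planner can turn those calculation number sets into concrete offsets and a total reusable-memory size, the sketch below uses a generic greedy first-fit placement over tensor lifetimes. This is an assumption-laden stand-in for the patent's "final memory allocation mode", not the claimed algorithm itself; plan_memory, its arguments, and the size/lifetime dictionaries are all hypothetical.

```python
from typing import Dict, Tuple

def plan_memory(sizes: Dict[str, int],
                lifetimes: Dict[str, Tuple[int, int]]) -> Tuple[int, Dict[str, int]]:
    """Greedy first-fit offset planner: place each tensor at the lowest offset that
    does not collide with any already-placed tensor whose lifetime overlaps.
    Returns (total reusable memory size, offset assigned to each tensor)."""
    offsets: Dict[str, int] = {}
    total = 0
    for t in sorted(sizes, key=lambda k: -sizes[k]):  # larger tensors first
        lo, hi = lifetimes[t]
        offset = 0
        while True:
            bump = None
            for other, off in offsets.items():
                olo, ohi = lifetimes[other]
                time_overlap = not (hi < olo or ohi < lo)
                mem_overlap = not (offset + sizes[t] <= off or off + sizes[other] <= offset)
                if time_overlap and mem_overlap:
                    bump = off + sizes[other]  # jump just past the collision and retry
                    break
            if bump is None:
                break
            offset = bump
        offsets[t] = offset
        total = max(total, offset + sizes[t])
    return total, offsets

# Sizes (bytes) and lifetimes taken from the earlier toy graph.
sizes = {"x": 128, "a": 512, "b": 256, "y": 128}
lifetimes = {"x": (0, 0), "a": (0, 1), "b": (1, 2), "y": (2, 2)}
total, offsets = plan_memory(sizes, lifetimes)
print(total, offsets)  # 768, versus the 1024 bytes a no-reuse layout would need
```

Tensors whose lifetimes do not overlap (here x and b, or a and y) end up sharing addresses, which is what keeps the returned total below the sum of all tensor sizes.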

Description

Technical field

[0001] The invention belongs to the technical field of computers, in particular to the technical field of artificial intelligence, and specifically relates to a neural network memory allocation method.

Background technique

[0002] Artificial intelligence has developed rapidly in recent years, and deep learning and neural networks are the basis of this development. Since a neural network often has many layers and large tensor sizes, it consumes a large amount of memory. Moreover, the demand for deploying neural networks on embedded devices has grown stronger in recent years. Therefore, the optimization of memory allocation is very important.

[0003] One memory optimization method is to adopt a traditional dynamic memory allocation method, such as the memory allocation of the malloc function in the C standard library. However, this dynamic allocation method does not allocate memory from a more global perspective, ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N 3/063; G06F 9/50; G06F 12/02
CPC: G06N 3/063; G06F 9/5016; G06F 12/0246
Inventors: 郑迪, 任俊林, 刘祥有, 凌云
Owner: HANGZHOU NATCHIP SCI & TECH