
Memory management method based on deep learning network

A memory management technology for deep learning networks, applicable to neural learning methods, biological neural network models, electrical digital data processing, etc. It achieves the effects of improving speed and performance while saving memory and other resources.

Pending Publication Date: 2021-10-22
杭州英歌智达科技有限公司

AI Technical Summary

Problems solved by technology

[0003] 1. The network architecture cannot use different processing units (such as a DSP and a GPU) to run different layers of the same network simultaneously.
[0004] 2. On embedded devices, memory is very limited.
In previous frameworks (such as NCNN and MACE), memory allocation is invisible to the user: the user can only monitor the network's memory usage through the system's memory-monitoring tools, and unpredictable allocation can make the network unstable on embedded devices.
[0005] 3. In previous network frameworks, GPU utilization is uncontrollable, and every network architecture maximizes its use of the GPU.
When multiple networks call the GPU at the same time, they compete for resources, making each network's running time unpredictable.




Embodiment Construction

[0029] As shown in Figure 1, a memory management method based on a deep learning network in an embodiment of the present invention includes:

[0030] S01, load network model parameters;

[0031] S03, load the network model;

[0032] S05, receiving data input by the user for network operation;

[0033] S07, comparing the network input size with the default network input size or the last network input size;

[0034] S09, if the network input size is larger, recalculate the temporary memory of each layer and the inter-layer transfer memory, and reallocate them;

[0035] S11, run the network according to the operating equipment and parameters set by the user.

[0036] Further, the above method also includes: if the network input size is not larger, the memory size remains unchanged.

[0037] Further, the above method also includes: setting the processing unit of the model network according to the user's choice, where the setting method includes any one of the following:...
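The size-comparison logic of steps S07-S09 and paragraph [0036] can be sketched as follows. This is a minimal illustration, not the patented implementation: the classes `Layer` and `Network`, the method `plan_memory`, and the per-element memory model are all hypothetical names and assumptions introduced here to show the reallocate-only-when-larger idea.

```python
# Hypothetical sketch of steps S07-S09/[0036]: reallocate per-layer temporary
# memory and inter-layer transfer memory only when the new input is larger.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Layer:
    name: str
    scratch_per_elem: int    # assumed: bytes of temporary memory per input element
    transfer_per_elem: int   # assumed: bytes of inter-layer transfer memory per element


@dataclass
class Network:
    layers: List[Layer]
    input_size: int = 0      # last planned input size (default 0 before first run)
    scratch: Dict[str, int] = field(default_factory=dict)
    transfer: Dict[str, int] = field(default_factory=dict)

    def plan_memory(self, new_input_size: int) -> bool:
        """Compare the new input size with the default/last size (S07).
        If larger, recompute and reallocate each layer's buffers (S09);
        otherwise keep the existing memory unchanged ([0036])."""
        if new_input_size <= self.input_size:
            return False     # memory size remains unchanged
        for layer in self.layers:
            self.scratch[layer.name] = layer.scratch_per_elem * new_input_size
            self.transfer[layer.name] = layer.transfer_per_elem * new_input_size
        self.input_size = new_input_size
        return True


net = Network(layers=[Layer("conv1", 4, 2), Layer("fc1", 8, 0)])
net.plan_memory(100)   # larger than default -> buffers allocated
net.plan_memory(50)    # smaller -> existing buffers reused, no reallocation
```

Growing buffers monotonically like this avoids the unpredictable reallocations the background section attributes to earlier frameworks, at the cost of holding memory sized for the largest input seen so far.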



Abstract

The invention relates to a memory management method based on a deep learning network. The method comprises the following steps: loading network model parameters; loading a network model; receiving network operation data input by a user; comparing the network input size with the default network input size or the last network input size; if the network input size is larger, recalculating the temporary memory of each layer and the inter-layer transfer memory, and reallocating them; and running the network according to the operating equipment and parameters set by the user.

Description

Technical field

[0001] The invention belongs to the field of deep learning, and in particular relates to memory management based on a deep learning network.

Background technique

[0002] With the rapid development of deep learning in recent years, products in computer vision, data processing, and other areas are increasing daily, and the chips on which deep learning can be deployed are diversifying. Existing deep learning deployment frameworks have the following problems:

[0003] 1. The network architecture cannot use different processing units (such as a DSP and a GPU) to run different layers of the same network simultaneously.

[0004] 2. On embedded devices, memory is very limited. In previous frameworks (such as NCNN and MACE), memory allocation is invisible to the user: the user can only monitor the network's memory usage through the system's memory-monitoring tools, and unpredictable allocation can cause network insta...


Application Information

IPC(8): G06N3/04, G06N3/08, G06F9/50
CPC: G06N3/08, G06F9/5016, G06N3/045
Inventor: 罗涛
Owner: 杭州英歌智达科技有限公司