Memory resource static deployment system and method

A memory resource static deployment system and method, applied in the field of static deployment of memory resources, which solves problems such as reduced model learning efficiency, insufficient memory resources, and increased thread-control complexity, achieving the effects of eliminating memory crashes, improving computing efficiency, and minimizing the data pipeline interval.

Active Publication Date: 2020-04-03
BEIJING ONEFLOW TECH CO LTD

AI Technical Summary

Problems solved by technology

However, data parallelism and model parallelism involve a large number of parameter interactions between threads, and because each thread's scheduler runs independently and cannot know the running status of the others, some threads may run short of memory resources during operation.
If each independent thread were instead to exchange state with the others, the control complexity of every thread would increase, producing interaction overhead of a very high order and seriously degrading the model's learning efficiency.



Examples


Embodiment Construction

[0027] The present disclosure will be described in further detail below in conjunction with the embodiments and drawings, so that those skilled in the art can implement it with reference to the description.

[0028] Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the accompanying drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with aspects of the present disclosure as recited in the appended claims.

[0029] The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to limit the present disclosure. As used in...



Abstract

The invention provides a memory resource static deployment system comprising: a memory resource obtaining component, used for obtaining the memory resource quantity of each computing device; a life cycle acquisition component, used for acquiring the life cycle of each logic output cache of each task node, from the moment data is written until that data is overwritten, on the basis of all topological paths to which each task node of the task relationship topology graph to be deployed on the plurality of computing devices belongs; a minimum data pipeline interval searching component, used for searching, between 0 and the maximum floating-point operation value, for the minimum data pipeline interval suitable for the system; and a memory pool number allocation component, used for dividing the acquired life cycle of each logic output cache by the minimum data pipeline interval and rounding the quotient up to an integer, so as to allocate that integer number of memory pools to the corresponding logic output cache.
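
The allocation rule in the abstract reduces to pools = ceil(life cycle / data pipeline interval) per logic output cache, with the interval chosen as the smallest value that the system can sustain. The sketch below is a minimal illustration of that rule, not the patented implementation: the helper names (pools_for_cache, fits, search_min_interval), the binary search over a per-device feasibility check, the capacity model, and all input numbers are assumptions; the abstract only states that the interval is searched between 0 and the maximum floating-point operation value.

```python
import math


def pools_for_cache(lifetime, interval):
    # Memory pools for one logic output cache:
    # round up (life cycle / data pipeline interval) to an integer.
    return math.ceil(lifetime / interval)


def fits(caches_per_device, device_capacity, interval):
    # Feasibility check (an assumption, not stated in the abstract): at a given
    # interval, the pools required by all logic output caches on a device must
    # fit into that device's memory budget.
    for device, caches in caches_per_device.items():
        needed = sum(pools_for_cache(lifetime, interval) * pool_size
                     for lifetime, pool_size in caches)
        if needed > device_capacity[device]:
            return False
    return True


def search_min_interval(caches_per_device, device_capacity, upper_bound, tol=1e-3):
    # Binary-search the smallest feasible interval in (0, upper_bound].
    # Enlarging the interval never increases ceil(lifetime / interval), so
    # feasibility is monotone and the search is well defined.
    lo, hi = tol, upper_bound
    if not fits(caches_per_device, device_capacity, hi):
        raise RuntimeError("no feasible interval up to the given upper bound")
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if fits(caches_per_device, device_capacity, mid):
            hi = mid
        else:
            lo = mid
    return hi


if __name__ == "__main__":
    # Hypothetical inputs: (life cycle, bytes per pool) for each logic output
    # cache, grouped by device, plus each device's memory budget in bytes.
    caches = {
        "device0": [(6.0, 256), (3.0, 128)],
        "device1": [(4.0, 512)],
    }
    capacity = {"device0": 2048, "device1": 2048}

    interval = search_min_interval(caches, capacity, upper_bound=10.0)
    print("minimum data pipeline interval ~", round(interval, 3))
    for device, entries in caches.items():
        for lifetime, _size in entries:
            print(device, "cache with life cycle", lifetime,
                  "gets", pools_for_cache(lifetime, interval), "memory pools")
```

A binary search is a natural reading of "searching between 0 and the maximum floating-point operation value" because a larger interval can only reduce the number of pools each cache needs, but the patent does not specify the search procedure.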

Description

Technical field

[0001] The present disclosure relates to a system and method for statically deploying memory resources of a data processing device, and more specifically to a system and method for statically deploying memory resources of a statically distributed stream data processing device.

Background technique

[0002] With the development of machine learning and the gradual deepening of artificial neural network research, the concept of deep learning has received wide attention and application. Since the data processing performed by deep learning consists mostly of large numbers of repetitive calculations, co-processors such as GPUs, TPUs and CPUs are often used to carry out this deep learning data processing.

[0003] Although existing parallel processing methods have improved data processing speed, the speed of data transfer between devices has not improved qualitatively compared with traditional CPU clusters. For e...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/50
CPC: G06F9/5016
Inventor: 李新奇, 柳俊丞, 成诚, 袁进辉
Owner: BEIJING ONEFLOW TECH CO LTD