
Cache management method and device

A cache management and cache-record technique, applied in the computer field, addressing problems such as read/write bottlenecks, exhausted disk storage space, and the lack of an effective existing solution.

Active Publication Date: 2019-11-26
INSPUR SUZHOU INTELLIGENT TECH CO LTD
Cites: 4 · Cited by: 9


Problems solved by technology

However, accessing data through shared storage suffers from read/write bottlenecks, and some enterprise clusters will not build and maintain a high-performance shared file system for technical or cost reasons. On the other hand, downloading data before training and deleting it after training spends a large part of the time on data download, while leaving the data in place after training may cause other problems once the disk storage space fills up.
[0004] The existing technology offers no effective solution to the difficulty of managing the training-data cache.




Embodiment Construction

[0040] To make the objects, technical solutions, and advantages of the present invention clearer, the embodiments of the present invention are described in further detail below with reference to specific embodiments and the accompanying drawings.

[0041] It should be noted that the expressions "first" and "second" in the embodiments of the present invention are used to distinguish two entities or parameters that share the same name but differ in value. "First" and "second" are used merely for convenience of expression and should not be construed as limiting the embodiments of the present invention; this will not be repeated in the subsequent embodiments.

[0042] Based on the above purpose, the first aspect of the embodiments of the present invention proposes an embodiment of a method capable of managing caches of different training data. Figure 1 is a schematic flowchart of the cache ma...
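The excerpt does not specify the format of the cache record file that tracks which data sets are cached on which nodes. A minimal sketch of what such a record might look like, and how it could answer the question "is the required data set already stored on any computing node?" — all field and function names here are illustrative assumptions, not the patent's actual identifiers:

```python
# Hypothetical structure for the "cache record file": per-node entries
# listing each cached data set with its size and last-use time.
# Field names (size_gb, last_used) are assumptions for illustration.
cache_record = {
    "node-01": {
        "imagenet-subset": {"size_gb": 60, "last_used": "2019-10-01"},
    },
    "node-02": {
        "coco2017": {"size_gb": 25, "last_used": "2019-10-03"},
    },
}

def nodes_holding(record, dataset):
    """Return the computing nodes that already cache the given data set."""
    return [node for node, cached in record.items() if dataset in cached]

print(nodes_holding(cache_record, "coco2017"))  # ['node-02']
```

A scheduler could use such a lookup to route a training task to a node that already holds the data, avoiding a redundant download.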



Abstract

The invention discloses a cache management method and device. The method comprises: generating a cache record file from the existing cached data sets on all computing nodes, and determining a caching strategy for each computing node; when a local computing node receives a training task, determining from the cache record file whether the data set required by the task is already stored on any computing node; if the local computing node satisfies the caching-strategy requirement, downloading the data set required by the task; and if the local computing node does not satisfy the requirement, deleting existing cached data based on the cache record file and re-checking whether the node now satisfies it. The method can manage caches of different training data, selectively download and delete training data according to the needs of the actual scenario, save training-data download time, and keep the disk storage of computing nodes available.
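The decision flow in the abstract — reuse a cached data set, download it if the caching strategy permits, otherwise evict existing cache entries and retry — can be sketched as follows. This is a hedged illustration under assumptions the excerpt does not confirm: the strategy is modeled as a simple per-node disk-capacity limit, and eviction removes the largest cached data sets first (the patent's actual strategy and eviction order are not specified here):

```python
def handle_training_task(node, dataset, size, record, capacity):
    """Decide how a computing node obtains the data set for a training task.

    record   -- dict mapping node name -> {dataset name: size} of cached data
    capacity -- assumed caching strategy: total cached size must fit this limit
    Returns one of "reuse", "download", "evict+download".
    """
    cached = record.setdefault(node, {})

    # Step 1: the data set is already cached on this node -> reuse it.
    if dataset in cached:
        return "reuse"

    def used():
        return sum(cached.values())

    # Step 2: node satisfies the caching strategy -> download and record it.
    if used() + size <= capacity:
        cached[dataset] = size
        return "download"

    # Step 3: otherwise delete existing cached data (largest first, an
    # assumed policy) until the strategy is satisfied, then download.
    for name in sorted(cached, key=cached.get, reverse=True):
        del cached[name]
        if used() + size <= capacity:
            break
    cached[dataset] = size
    return "evict+download"
```

For example, on a node with a 100 GB budget already caching a 60 GB data set, a task needing a 50 GB data set would trigger eviction of the 60 GB set followed by a download.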

Description

Technical field

[0001] The present invention relates to the field of computers, and more specifically to a cache management method and device.

Background technique

[0002] In deep learning model training, the larger the training data set and the more data samples it contains, the easier it is to avoid overfitting of the trained model. At the same time, however, large-scale data sets pose challenges for cluster management. For example, the data collected for a single video-processing task can reach tens or even hundreds of gigabytes. In a deep learning cluster shared by multiple users, different users may train models on the same data set, and a single user may also train on several different data sets. Because the storage space of computing nodes is limited, this training data cannot be stored on every computing node for simultaneous use, and doing so would also waste storage space. How to store and use these training data has become ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/17
CPC: G06F16/172
Inventor: 胡叶
Owner: INSPUR SUZHOU INTELLIGENT TECH CO LTD