
Intelligent compression storage method and system for neural network checkpoint data

A neural network intelligent-compression technology, applied in neural learning methods, biological neural network models, electrical components, etc. It addresses the problem of insufficient device lifetime, achieving efficient compression, improved storage capacity and lifetime, and reduced data writing.

Active Publication Date: 2021-10-08
ZHEJIANG UNIV

Problems solved by technology

[0003] In order to solve the problem of insufficient device lifetime caused by a large number of checkpoint write operations, the present invention proposes an intelligent compression method and system for checkpoint data, which compresses the different types of floating-point data in checkpoint files through differently designed mechanisms, so as to reduce the amount of data written to the persistent device and alleviate the lifetime loss of the device.



Embodiment Construction

[0037] The present invention proposes an intelligent compression storage method for neural network checkpoint data, specifically:

[0038] Use an incremental compression method to compress and store the weight floating-point data after each round of neural network training; and/or use index-value mapping to replace some or all of the first n bits of the optimizer floating-point data after each round of neural network training before storing it, wherein the number of bits of the index value is less than n.
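As a rough illustration of the two mechanisms described above, consider the following Python sketch (the helper names, the choice of n, and the XOR-delta formulation are illustrative assumptions, not details from the patent). XOR-ing the bit patterns of consecutive weight checkpoints yields many zero bits when per-round updates are small, and a dictionary of repeated n-bit prefixes lets each optimizer float be stored with a short index in place of those bits:

```python
import struct

def float_to_bits(x):
    """Reinterpret a 32-bit float as its unsigned integer bit pattern."""
    return struct.unpack("<I", struct.pack("<f", x))[0]

def xor_delta(prev_weights, curr_weights):
    """Incremental-compression idea (one possible realization): XOR the bit
    patterns of consecutive weight checkpoints. Small weight updates leave
    the sign and most exponent/mantissa bits unchanged, so the deltas
    contain long zero runs that a back-end entropy coder compresses well."""
    return [float_to_bits(p) ^ float_to_bits(c)
            for p, c in zip(prev_weights, curr_weights)]

def map_high_bits(values, n=9):
    """Index-value-mapping idea: optimizer floats often occupy a narrow
    numeric range, so their first n bits (sign + exponent for n=9 in IEEE 754
    binary32) repeat heavily. Replace each distinct n-bit prefix with a
    dictionary index; with few distinct prefixes, the index needs fewer
    than n bits to encode."""
    prefixes = {}   # prefix -> index
    encoded = []    # (prefix index, remaining low bits) per value
    for x in values:
        bits = float_to_bits(x)
        prefix = bits >> (32 - n)
        idx = prefixes.setdefault(prefix, len(prefixes))
        encoded.append((idx, bits & ((1 << (32 - n)) - 1)))
    return prefixes, encoded
```

For example, the floats 1.5 and 1.25 share the same sign and exponent, so both map to prefix index 0, while 2.0 introduces a second dictionary entry; an unchanged weight produces an all-zero XOR delta.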

[0039] The present invention will be further described below with reference to the accompanying drawings and a specific embodiment:

[0040] The intelligent compression method of the present invention specifically comprises the following steps:

[0041] Step 1: After a round of deep learning training ends, the system opens up a new area in GPU memory and copies the weight floating-point data and optimizer floating-point data into this area.

[0042] Step 2:...


Abstract

The invention provides an intelligent compression storage method for neural network checkpoint data. The neural network checkpoint data comprises weight floating-point data and optimizer floating-point data. The method specifically comprises the following steps: compressing and storing the weight floating-point data after each round of training of the neural network by using an incremental compression method; and/or using index-value mapping to replace the first n bits of some or all of the optimizer floating-point data after each round of training of the neural network before storing it, wherein the number of bits of the index value is smaller than n. The intelligent compression method and system are designed by exploiting the characteristics of deep learning checkpoint data in combination with the model training process, and the different types of checkpoint data are efficiently compressed, so that the storage capacity of the storage system is effectively increased and its service life is effectively prolonged.

Description

Technical field

[0001] The invention relates to the field of computer science and artificial intelligence, in particular to an intelligent compression method and system for neural network checkpoint data.

Background technique

[0002] The innovation of deep learning technology has greatly promoted the development of computer vision, natural language processing, medicine and other fields, and has received great attention in both academia and industry. In order to obtain high accuracy, deep learning models need to be trained for a long time through iterative updates. During model training, a large number of parameters are generated (some models reach data volumes at the GB or even TB level), and if the training process crashes, the task must be restarted. In order to prevent data loss, the parameters trained by a deep learning model need to be periodically stored on high-speed persistent devices (including SSDs and non-volatile memory) in the form of che...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC (8): G06N3/08; H03M7/30
CPC: G06N3/082; H03M7/30; Y02D10/00
Inventor 何水兵, 陈平, 洪佩怡, 张寅, 陈刚
Owner ZHEJIANG UNIV