
Management method by using rapid non-volatile medium as cache

A management method using non-volatile technology, applied in memory systems, electrical digital data processing, memory address/allocation/relocation, etc., which can solve problems such as data loss.

Active Publication Date: 2012-06-27
DAWNING INFORMATION IND BEIJING +1

AI Technical Summary

Problems solved by technology

[0003] However, when a volatile device such as memory is used, the data cached on the high-speed device is lost after a system crash or power failure, which is unacceptable.




Embodiment Construction

[0015] The present invention uses the Device Mapper mechanism of Linux to manage multiple block devices, using the high-speed devices as caches for the low-speed devices to build a two-level storage system, thereby obtaining higher storage performance at a lower cost. The overall structure is shown in Figure 1.
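
As a rough illustration of this two-level idea (not the actual kernel Device Mapper target, which is implemented as a kernel module), the following user-space Python sketch shows a fast device fronting a slow device: reads are served from the fast device on a hit and populate it on a miss. All class and method names are hypothetical.

```python
# Conceptual user-space sketch of the two-level storage idea: a fast device
# (SSD) fronts a slow device (conventional disk). This is NOT the kernel
# Device Mapper target itself; all names here are hypothetical illustrations.

class BlockDevice:
    """Toy block device backed by a dict: block number -> bytes."""
    def __init__(self):
        self.blocks = {}
    def read(self, n):
        return self.blocks.get(n, b"\x00" * 4096)
    def write(self, n, data):
        self.blocks[n] = data

class TwoLevelStore:
    """Pseudo device exposed to the user; cache and disk stay hidden behind it."""
    def __init__(self, cache_dev, disk_dev):
        self.cache = cache_dev      # fast non-volatile device
        self.disk = disk_dev        # slow conventional disk
        self.map = {}               # disk block -> cache block

    def read(self, block):
        if block in self.map:                       # cache hit: serve from the SSD
            return self.cache.read(self.map[block])
        data = self.disk.read(block)                # cache miss: read the slow disk
        slot = len(self.map)                        # naive allocation of a cache slot
        self.cache.write(slot, data)                # populate the cache for next time
        self.map[block] = slot
        return data

store = TwoLevelStore(BlockDevice(), BlockDevice())
store.disk.write(7, b"hello")
assert store.read(7) == b"hello"    # first read misses and fills the cache
assert store.read(7) == b"hello"    # second read is served from the cache
```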

[0016] The invention divides the managed devices into cache devices and disk devices. The cache devices use high-performance solid-state disks; in actual use, multiple solid-state disks can be combined into a RAID to improve cache performance. Neither the cache devices nor the disk devices are visible to the user; instead, the invention presents the user with pseudo devices that match the disk devices in number and characteristics. The invention can manage multiple cache devices and disk devices, and the relationship between a cache device and disk devices is one-to-many, that is, one cache device can be shared by multiple disk devices, while each disk device uses only one cache device, as sketched below.
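
The one-to-many relationship between cache devices and disk devices can be expressed as a simple constraint, as in the following illustrative Python fragment. Names and device identifiers are hypothetical; the real bookkeeping lives inside the kernel module.

```python
# Sketch of the device topology described above: one cache device may be
# shared by several disk devices, but each disk is bound to exactly one cache.

class Topology:
    def __init__(self):
        self.disk_to_cache = {}     # enforces: each disk has exactly one cache
        self.cache_to_disks = {}    # a cache may serve many disks

    def bind(self, disk, cache):
        if disk in self.disk_to_cache:
            raise ValueError(f"{disk} is already bound to {self.disk_to_cache[disk]}")
        self.disk_to_cache[disk] = cache
        self.cache_to_disks.setdefault(cache, []).append(disk)

topo = Topology()
topo.bind("sdb", "ssd0")    # ssd0 caches sdb ...
topo.bind("sdc", "ssd0")    # ... and sdc (one-to-many is allowed)
# topo.bind("sdb", "ssd1")  # would raise: a disk cannot use two caches
```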


Abstract

The invention provides a management method that uses a rapid non-volatile medium as a cache. In this method, solid-state disks serve as cache devices; each cache device is shared by multiple disks, while each disk can use only one cache device. The cache device is divided into regions, and a mapping connects the cache devices and the disks. The cache device is split into two parts: the front part stores the region data structures held in memory, and the back part holds the cached data. To solve the problem of data loss, the method writes metadata to the solid-state disk in real time: whenever dirty data is written to the solid-state disk, the metadata that manages that dirty data is synchronized to an assigned position on the solid-state disk. When the system restarts, reading only this metadata is enough to recover all data cached on the solid-state disk that has not yet been written to the conventional disk, so no data is lost when the system crashes or loses power.
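
A hedged sketch of the layout and metadata policy summarized above: the front of the cache device holds fixed-size metadata records, the rest holds cached blocks, and every write of dirty data is accompanied by a write of its metadata record, so recovery after a crash only needs to scan the metadata area. The slot sizes, the JSON record format, and all names below are assumptions for illustration, not details taken from the patent.

```python
import io
import json

METADATA_REGIONS = 1024          # hypothetical number of metadata slots at the front
METADATA_SLOT_SIZE = 512         # hypothetical fixed size per metadata record
CACHE_BLOCK_SIZE = 4096

def metadata_offset(slot):
    return slot * METADATA_SLOT_SIZE

def data_offset(slot):
    # cached data starts after the metadata area at the front of the device
    return METADATA_REGIONS * METADATA_SLOT_SIZE + slot * CACHE_BLOCK_SIZE

def write_dirty(ssd, slot, disk_id, disk_block, data):
    """Write dirty data and, in the same step, persist the metadata describing it."""
    ssd.seek(data_offset(slot))
    ssd.write(data)
    record = json.dumps({"disk": disk_id, "block": disk_block, "dirty": True})
    ssd.seek(metadata_offset(slot))
    ssd.write(record.encode().ljust(METADATA_SLOT_SIZE, b"\x00"))
    ssd.flush()

def recover(ssd):
    """After a crash, scan only the metadata area to find un-flushed dirty blocks."""
    dirty = []
    for slot in range(METADATA_REGIONS):
        ssd.seek(metadata_offset(slot))
        raw = ssd.read(METADATA_SLOT_SIZE).rstrip(b"\x00")
        if raw:
            rec = json.loads(raw)
            if rec.get("dirty"):
                dirty.append((slot, rec["disk"], rec["block"]))
    return dirty

ssd = io.BytesIO()                                         # stand-in for the raw cache device
write_dirty(ssd, 0, "sdb", 42, b"\x01" * CACHE_BLOCK_SIZE)
print(recover(ssd))                                        # -> [(0, 'sdb', 42)]
```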

Description

Technical field

[0001] The invention relates to disk cache management technology, in particular to a management strategy for caching the data of slow disks on high-speed non-volatile media.

Background technique

[0002] Disk caching is divided into read caching and write caching. With read caching, the operating system keeps file data that has already been read in memory while memory is relatively free, so that when software or a user reads the same file again it does not need to be read from the disk, which improves the read rate. With write caching, data to be written to the disk is first kept in memory space that the system allocates for the write cache, and once the data held in this memory pool reaches a certain level it is written to the hard disk. This reduces the number of actual disk operations, protects the disk from the wear of repeated read and write operations, and also shortens the time needed for writing. According to different writing methods...
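
To make the write-caching behaviour described above concrete, here is a toy Python sketch in which writes accumulate in a volatile memory pool and are flushed to the disk in a batch once a threshold is reached. The threshold and all names are illustrative assumptions; the sketch also shows why a crash before the flush loses the pending data, which is the problem the invention addresses.

```python
# Toy illustration of write caching: writes accumulate in a memory pool and
# are flushed to the disk together once the pool reaches a threshold,
# reducing the number of actual disk operations.

class ToyDisk:
    """Stands in for a slow disk: just records what was written."""
    def __init__(self):
        self.blocks = {}
    def write(self, block, data):
        self.blocks[block] = data

class WriteCache:
    def __init__(self, disk, flush_threshold=8):
        self.disk = disk                  # object with a write(block, data) method
        self.pending = {}                 # block -> data, held in volatile memory
        self.flush_threshold = flush_threshold

    def write(self, block, data):
        self.pending[block] = data        # absorb the write in memory only
        if len(self.pending) >= self.flush_threshold:
            self.flush()                  # batch the accumulated writes to disk

    def flush(self):
        for block, data in sorted(self.pending.items()):
            self.disk.write(block, data)  # the actual disk operations happen here
        self.pending.clear()

disk = ToyDisk()
wc = WriteCache(disk, flush_threshold=2)
wc.write(1, b"a")            # held only in volatile memory; lost if power fails here
wc.write(2, b"b")            # threshold reached: both blocks reach the disk
assert disk.blocks == {1: b"a", 2: b"b"}
```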


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/08, G06F12/0871
Inventors: 袁清波, 许建卫, 刘新春, 邵宗有
Owner: DAWNING INFORMATION IND BEIJING