
LRU flash memory cache management method based on dynamic page weight

A dynamic page weight and cache management technology, applied in the field of storage systems, which addresses problems such as low hit rate, high write consumption, and the lack of cache management methods designed for flash memory.

Active Publication Date: 2019-06-07
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

However, most current cache management methods are optimized for hard-disk storage devices, and there is a lack of cache management methods designed for the flash cache area.
When a traditional cache management method is applied to flash memory, the hit rate is low, and the above-mentioned problems of high latency and high write consumption remain unsolved.




Embodiment Construction

[0064] A dynamic page weight-based flash memory cache management method provided by the present invention will be further described below with reference to the accompanying drawings.

[0065] Referring to figure 1, which shows the flow chart of the present invention, the specific implementation of the present invention is as follows:

[0066] Step S1: Read the page requests in the request queue, and identify and classify each request's type and the area where it is located. This specifically includes the following steps:

[0067] Step S11: Preprocess the page request queue. Let a page request be $R$, which contains the request number $R_{pid}$ and the page request mode $R_{am} \in \{\mathrm{read}, \mathrm{write}\}$. The request queue $S$ can be expressed by the following formula:

[0068] $S = \{R_1, R_2, \ldots, R_j, \ldots, R_n\}, \quad 1 \le j \le n$

[0069] The request queue $S$ is a collection of page requests $R_{pid}$, where $n$ represents the total number of requests in the request queue $S$ and $j$ represents the page request number. Set ...
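
As a concrete illustration of step S11, the following Python sketch models a page request with its request number R_pid and access mode R_am, and a simple preprocessing pass over the request queue S. The dataclass, field names, and the preprocess helper are illustrative assumptions, not code from the patent.

```python
from dataclasses import dataclass
from typing import List, Literal

@dataclass
class PageRequest:
    pid: int                         # request number R_pid
    am: Literal["read", "write"]     # access mode R_am ∈ {read, write}

def preprocess(queue: List[PageRequest]) -> List[PageRequest]:
    """Validate the request queue S = {R_1, ..., R_j, ..., R_n}, 1 <= j <= n."""
    for j, r in enumerate(queue, start=1):
        if r.am not in ("read", "write"):
            raise ValueError(f"request {j}: unknown access mode {r.am!r}")
    return queue

# Example: a three-request queue S (n = 3)
S = preprocess([PageRequest(1, "read"), PageRequest(2, "write"), PageRequest(3, "read")])
```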



Abstract

The invention discloses an LRU flash memory cache management method based on dynamic page weight. The method comprises: S1, reading the page requests in a request queue, and identifying and classifying the page request types and the areas where the page requests are located; and S2, judging the state of insertion into the buffer area, judging the page to be evicted by using the LRU method based on dynamic page weight, adjusting the state of the buffer area, and executing the page request. According to the technical scheme, the buffer area is divided into a working area and an exchange area; cold pages, hot pages, dirty pages and clean pages are distinguished; the page request type and the buffer area are determined; and the page to be evicted is then judged, in combination with the LRU method based on dynamic page weight, to complete the page request. The technical scheme can effectively reduce the read-write consumption and time delay of the flash memory cache read-write process, and at the same time greatly increase the hit rate of flash memory cache reads and writes.
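
To make the scheme in the abstract concrete, the sketch below shows one way a buffer split into a working area and an exchange area, with eviction by a dynamic page weight rather than pure recency, could look in Python. The class, the hot/dirty bookkeeping, and especially the weight() formula are hypothetical stand-ins for illustration only; the patent defines its own areas, page states, and weighting.

```python
from collections import OrderedDict

class DynamicWeightLRU:
    def __init__(self, work_size: int, swap_size: int):
        self.work = OrderedDict()   # working area: page_id -> page state
        self.swap = OrderedDict()   # exchange area: eviction candidates
        self.work_size = work_size
        self.swap_size = swap_size

    @staticmethod
    def weight(page: dict) -> float:
        # Assumed weighting for illustration: hot pages (more accesses) and dirty
        # pages (expensive to write back to flash) get higher weight and are kept
        # longer; the patent's actual dynamic weight formula is not reproduced here.
        return page["accesses"] + (2.0 if page["dirty"] else 0.0)

    def access(self, page_id: int, write: bool) -> None:
        page = (self.work.pop(page_id, None)
                or self.swap.pop(page_id, None)
                or {"accesses": 0, "dirty": False})
        page["accesses"] += 1
        page["dirty"] = page["dirty"] or write
        self.work[page_id] = page          # re-insert at the most-recently-used end
        self._balance()

    def _balance(self) -> None:
        # Demote least-recently-used working-area pages into the exchange area ...
        while len(self.work) > self.work_size:
            pid, page = self.work.popitem(last=False)
            self.swap[pid] = page
        # ... and evict the lowest-weight page when the exchange area overflows.
        while len(self.swap) > self.swap_size:
            victim = min(self.swap, key=lambda pid: self.weight(self.swap[pid]))
            self.swap.pop(victim)          # a dirty victim would be written back here

# Example: five requests as (page_id, is_write) pairs
cache = DynamicWeightLRU(work_size=2, swap_size=2)
for pid, is_write in [(1, False), (2, True), (3, False), (1, False), (4, True)]:
    cache.access(pid, write=is_write)
```

Evicting from the exchange area by lowest weight, rather than by recency alone, is what would allow dirty, write-expensive pages to stay cached longer, which is the intuition behind reducing flash write consumption described in the abstract.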

Description

Technical Field

[0001] The invention relates to the field of storage systems, and in particular to an LRU flash memory cache management method based on dynamic page weight.

Background Technique

[0002] NAND flash storage technology has been widely used in enterprise applications due to its high performance, small size, and low energy consumption. However, with the continuous development of big data technology in recent years, the processing and analysis of massive data have placed higher demands on the data throughput and I/O latency of storage systems. Defects such as asymmetric I/O latency and block erasure make it impossible for flash memory to completely replace hard disk storage.

[0003] Combining caching technology with storage devices can effectively reduce I/O latency and reduce the asymmetry between different storage layers. However, most current cache management methods are optimized for hard-disk storage devices, and there is a lack of a cache area management method for flash memory. ...


Application Information

IPC(8): G06F12/0871, G06F12/123
Inventor: 袁友伟, 陶文鹏, 张锦涛, 贾刚勇, 鄢腊梅
Owner: HANGZHOU DIANZI UNIV