
A Multi-Level Cache Method Based on Historical Upgrading and Decreasing Frequency

A caching and frequency-tracking technique in the field of data reading, writing, and storage for computer systems. It addresses problems such as inadequate description of data-block state and the neglect of valuable historical implicit information, thereby reducing average response time while keeping space consumption small and read/write performance good.

Active Publication Date: 2017-12-22
SHANGHAI JIAO TONG UNIV


Problems solved by technology

[0005] However, existing multi-level caching algorithms still leave room for improvement. On the one hand, most multi-level caching algorithms use implicitly stored, centralized information about data blocks but ignore valuable historical implicit information; on the other hand, some algorithms save only the implicit information of the last few operations, which does not adequately describe the state of a data block.




Detailed Description of Embodiments

[0032] In order to make the above objects, features and advantages of the present invention more comprehensible, the present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0033] The present invention provides a multi-level caching method based on historical upgrade and downgrade frequency, comprising:

[0034] Step S1: compute the implicit frequency of each data block in each cache level, where the implicit frequency is the number of times the data block has been upgraded plus the number of times it has been downgraded, per unit time;
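The per-block bookkeeping described in step S1 can be sketched as follows. This is a minimal, hypothetical reading of the patent text: the class name `BlockStats`, the use of a monotonic clock, and the single observation window are all assumptions, not details given in the source.

```python
import time


class BlockStats:
    """Tracks the implicit frequency of one cached data block.

    Implicit frequency (per step S1 of the method) is the number of
    promotions (upgrades) plus demotions (downgrades) the block has
    undergone, divided by the elapsed observation time.
    """

    def __init__(self, block_id, now=None):
        self.block_id = block_id
        self.upgrades = 0
        self.downgrades = 0
        # Start of the observation window; injectable for testing.
        self.window_start = now if now is not None else time.monotonic()

    def record_upgrade(self):
        self.upgrades += 1

    def record_downgrade(self):
        self.downgrades += 1

    def implicit_frequency(self, now=None):
        now = now if now is not None else time.monotonic()
        # Guard against a zero-length window.
        elapsed = max(now - self.window_start, 1e-9)
        return (self.upgrades + self.downgrades) / elapsed
```

For example, a block upgraded twice and downgraded once within a 2-second window has an implicit frequency of 1.5 operations per second.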

[0035] Step S2: establish a high implicit-frequency queue and a low implicit-frequency queue in each cache level, where the high implicit-frequency queue stores data blocks with high implicit frequency and the low implicit-frequency queue stores data blocks with the lowest implicit frequency; here, in each level of the multi-level cache system, tw...
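One possible shape for the two-queue cache level of step S2 is sketched below. The threshold-based split, the LRU ordering inside each queue, and the cold-queue-first eviction policy are assumptions used to make the sketch concrete; the patent text as excerpted does not specify them.

```python
from collections import OrderedDict


class CacheLevel:
    """One level of a multi-level cache, split into two LRU queues.

    Hypothetical sketch of step S2: blocks whose implicit frequency is
    at or above `threshold` live in the hot queue; the rest live in the
    cold queue. Eviction victims come from the cold queue first, so hot
    blocks stay resident longer.
    """

    def __init__(self, capacity, threshold):
        self.capacity = capacity
        self.threshold = threshold
        self.hot = OrderedDict()   # high implicit-frequency blocks (LRU order)
        self.cold = OrderedDict()  # low implicit-frequency blocks (LRU order)

    def insert(self, block_id, freq):
        """Place a block in the queue matching its implicit frequency."""
        queue = self.hot if freq >= self.threshold else self.cold
        queue[block_id] = freq
        queue.move_to_end(block_id)  # mark as most recently used
        self._evict_if_needed()

    def _evict_if_needed(self):
        while len(self.hot) + len(self.cold) > self.capacity:
            # Prefer demoting from the cold queue; fall back to the hot
            # queue only when the cold queue is empty.
            victim_queue = self.cold if self.cold else self.hot
            victim_queue.popitem(last=False)  # drop least-recently-used
```

Keeping the two queues separate localizes the implicit information, which is the property the abstract credits with avoiding the mixing of hot and cold blocks.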


Abstract

The present invention provides a multi-level caching method based on historical upgrade and downgrade frequency. The method builds on the historical implicit information of data blocks, which is intrinsic to a multi-level cache system. Through the implicit frequency, hot data blocks can be identified effectively and accurately and kept in a higher-level cache for a longer period of time, which increases the system's data-block hit rate and reduces average response time. The invention splits the traditional LRU stack into two dedicated queues, localizing the implicit information, avoiding the mixing of hot and cold data blocks, and reducing bandwidth use between cache levels. Because hot data blocks can remain in the high-level cache for a long time, downgrade and upgrade operations between levels are reduced, further cutting inter-level bandwidth consumption. The space overhead of the method is very small, laying a foundation for good read and write performance under various workloads.

Description

Technical Field

[0001] The invention relates to the field of data reading, writing, and storage in computer systems, and in particular to a multi-level caching method based on historical upgrade and downgrade frequencies.

Background

[0002] In a large data center, heterogeneous storage devices work together to accelerate data read and write operations. Characteristically, a higher-level storage device acts as a cache for a lower-level storage device, forming a distributed multi-level cache system. In recent years, multi-level cache systems have received increasing attention due to their high I/O performance, low monetary cost, and high flexibility.

[0003] Over the past two decades, many multi-level caching schemes have been proposed to improve the I/O performance of storage systems. One of the most effective approaches is to build another cache between levels and use it to implicitly identify hot data blocks. These hints p...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/121
Inventors: 李颉, 吴晨涛, 过敏意, 何绪斌, 冯博, 黄洵松
Owner: SHANGHAI JIAO TONG UNIV