
A storage system cache strategy adaptive method

A caching-strategy technology for storage systems, applied to memory systems, instruments, and electrical and digital data processing. It addresses the problem that storage systems respond slowly to changes in data access patterns, with the effects of reducing cache pollution, improving the cache hit rate, and improving overall performance.

Active Publication Date: 2018-06-19
LANGCHAO ELECTRONIC INFORMATION IND CO LTD

AI Technical Summary

Problems solved by technology

However, this method depends on the expertise of the storage system administrator, and it cannot respond promptly to changes in the storage system's data access pattern.




Embodiment Construction

[0021] The present invention is described in detail below with reference to the accompanying drawings. The storage-system caching-strategy adaptive method comprises the following steps:

[0022] (1) Monitor and count the data access requests received by the storage system, and analyze the resulting statistics to determine the storage system's data access pattern. The data access pattern refers to the access characteristics of the requests that upper-layer applications send to the storage system, including the ratio of read requests to write requests, the randomness of data access, and whether hot data exists.
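Step (1) can be illustrated with a small sketch that derives these three characteristics from a request trace. The `AccessRequest` record, the threshold values, and the 10%-of-blocks heuristic for hot data are all illustrative assumptions, not values specified by the patent.

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class AccessRequest:
    # Hypothetical record of one I/O request seen by the storage system.
    op: str        # "read" or "write"
    block: int     # logical block address

def classify_pattern(requests, hot_threshold=0.2, seq_threshold=0.5):
    """Derive a coarse access-pattern summary from a request trace.

    Reports the read/write ratio, an estimate of sequentiality, and whether
    a small set of blocks receives a large share of accesses (hot data).
    Thresholds are illustrative assumptions, not values from the patent.
    """
    reads = sum(1 for r in requests if r.op == "read")
    read_ratio = reads / len(requests)

    # Sequentiality: fraction of requests whose block follows the previous one.
    seq = sum(
        1 for a, b in zip(requests, requests[1:]) if b.block == a.block + 1
    ) / max(len(requests) - 1, 1)

    # Hot data: does the most popular 10% of distinct blocks absorb more
    # than hot_threshold of all traffic?
    counts = Counter(r.block for r in requests)
    top = sorted(counts.values(), reverse=True)
    top_n = max(1, len(top) // 10)
    hot = sum(top[:top_n]) / len(requests) > hot_threshold

    return {
        "read_ratio": read_ratio,
        "sequential": seq >= seq_threshold,
        "has_hot_data": hot,
    }
```

In a real storage stack these counters would be updated incrementally on the I/O path rather than computed over a stored trace, but the derived summary is the same.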

[0023] The statistics function and the statistical records of data access requests can be merged and shared with other functional modules in the storage system, reducing the performance overhead of collecting request statistics. At the same time, statistics on acce...



Abstract

The invention relates to a storage-system cache-policy self-adaptation method. The method obtains the data access pattern by statistically analyzing the storage system's data access requests, and automatically selects a suitable cache policy according to that pattern. This solves the problem that a single cache policy cannot adapt to complex and changeable business requirements and that policy changes must otherwise be performed manually: the storage system automatically selects the cache policy best suited to the actual data access characteristics. This improves the cache hit rate, reduces cache pollution, reduces cache performance overhead, and improves overall storage system performance.
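The adaptive idea in the abstract amounts to a mapping from detected access pattern to cache policy. The rules below are a plausible illustration of such a mapping; the patent does not publish these exact rules, and the pattern dictionary keys are assumptions carried over from the monitoring step.

```python
def select_policy(pattern):
    """Map a detected access pattern to a caching-policy name.

    `pattern` is assumed to carry boolean flags "has_hot_data" and
    "sequential" produced by the monitoring/statistics step.
    """
    if pattern["has_hot_data"]:
        return "LFU"           # frequency-based policy keeps hot blocks resident
    if pattern["sequential"]:
        return "MRU+prefetch"  # sequential scans pollute LRU; prefetch instead
    return "LRU"               # default recency-based policy
```

Re-evaluating this mapping periodically, as the statistics change, is what makes the policy selection self-adapting rather than fixed at configuration time.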

Description

Technical field

[0001] The invention relates to the field of computer application technology, in particular to a storage-system cache-policy self-adaptation method.

Background technique

[0002] The hardware that makes up a computer system is diverse, and I/O access performance varies widely. To bridge data transfer between hardware devices with different I/O performance, caching technology was invented, followed by a series of caching strategies. A cache improves its hit rate through well-designed algorithms for data blocking, prefetching, sequential prefetching, cache replacement, and so on. In storage systems, the main caching strategies include policies based on access time, policies based on access frequency, and policies based on both access time and frequency. A traditional caching strategy mainly specifies which data needs to be cached, specifies when to perform a cache replacement operation, and ...
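To make the policy families in the background concrete, here is a minimal access-time-based (LRU) cache, built on `collections.OrderedDict`. It is a textbook illustration of one of the strategies mentioned, not the patent's own design.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal recency-based (LRU) cache.

    On a hit, the entry is moved to the most-recently-used position; when
    capacity is exceeded, the least-recently-used entry is evicted.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)   # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used
```

A frequency-based (LFU) variant would instead track an access counter per entry and evict the entry with the lowest count; time-and-frequency hybrids weight both signals.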

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/0893; G06F12/0877
Inventor: 马春
Owner: LANGCHAO ELECTRONIC INFORMATION IND CO LTD