
A cache memory replacement method and system based on usage heat

A cache and memory technology, applied in memory systems, instruments, computing, etc. It addresses the problems that existing replacement strategies cannot be changed flexibly, occupy many resources, or have a low hit rate, with the effects of increased replacement flexibility, low resource usage, and high generality.

Active Publication Date: 2021-12-14
TIH MICROELECTRONIC TECH CO LTD +1

AI Technical Summary

Problems solved by technology

However, algorithms with a high cache hit rate are generally complex to implement and consume many resources, while simple algorithms with a small resource footprint tend to have a low hit rate, or depend heavily on code execution patterns, offering little flexibility and no way to adapt the replacement strategy to the code being executed.



Examples


Embodiment 1

[0040] The present disclosure provides a cache memory replacement method based on usage heat, including:

[0041] S1: As shown in figure 1, the cache memory (Cache) is divided into n Cache blocks, and the n Cache blocks together with their corresponding heat values form a heat comparison group;

[0042] In this comparison group there are n blocks in total for heat comparison; the blocks are named B1 to Bn, and their corresponding count values are C1 to Cn.

[0043] S2: Define parameter values

[0044] a. Initial heat value S: the initial count value Cx assigned to a data block Bx after a data replacement.

[0045] b. Heat decay factor b: after a block misses, the amount by which its heat count is decreased; this value can be fixed, or it can vary with the heat value.

[0046] c. Heat enhancement factor i: after a block is hit, the amount by which its heat count is increased; this value can be fixed, or it can vary with the heat value.
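The parameters above can be sketched as follows. This is a minimal illustrative model, not the patent's actual implementation: the names `N`, `S`, `b`, `i`, the concrete values, and the list-based layout are all assumptions, and fixed (non-varying) decay and enhancement factors are used for simplicity.

```python
# Hypothetical sketch of the heat comparison group from steps S1-S2.
N = 4   # number of Cache blocks B1..Bn in the heat comparison group
S = 8   # initial heat value assigned to a block after replacement
b = 1   # heat decay factor: decrement applied to blocks that miss
i = 2   # heat enhancement factor: increment applied to the hit block

# C[x] is the heat count Cx for block Bx; every block starts at the initial heat.
C = [S] * N

def on_hit(x):
    """A read hit block Bx: raise its heat by i, decay every other block by b."""
    for k in range(N):
        if k == x:
            C[k] += i
        else:
            C[k] = max(0, C[k] - b)   # heat never goes below zero

on_hit(2)
# C is now [7, 7, 10, 7]: block 2 gained i, the others lost b
```

A hit thus widens the gap between hot and cold blocks, which is what later lets the replacement step single out the coldest block.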

Embodiment 2

[0059] The present disclosure provides a cache memory replacement system based on usage heat, comprising:

[0060] A blocking module, which is used to divide the cache memory (Cache) into n Cache blocks, the n Cache blocks and their corresponding heat values forming a heat comparison group;

[0061] A data reading module, which is used to judge, according to a received CPU data read request, whether the CPU data to be read exists in the Cache; on a hit, the hit Cache block and its corresponding heat value are found in the heat comparison group, the heat value is increased according to a preset heat enhancement factor, the heat values of the remaining missed Cache blocks are attenuated according to the heat decay factor, and the CPU data is read from the hit Cache block;

[0062] A replacement module, which is used, on a miss, to find the Cache block in the heat comparison group whose heat value is smallest and less than or equal to the replacement threshold, replace the CPU data to be read into that Cache block, and attenuate the heat values of the remaining unreplaced Cache blocks according to the heat decay factor.
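The replacement module's miss path can be sketched in the same spirit. Again this is an assumed illustration: the tag/heat pairs, the threshold name `T`, and the concrete values are hypothetical, and a fixed decay factor is used.

```python
# Hypothetical sketch of the replacement module on a Cache miss.
S, b, T = 8, 1, 4   # initial heat, heat decay factor, replacement threshold

# Each entry models one Cache block as [tag, heat]; tags identify cached data.
blocks = [["a", 6], ["b", 3], ["c", 2], ["d", 9]]

def on_miss(new_tag):
    """Replace the block with the smallest heat that is <= T; decay the rest.

    If no block's heat is at or below T, nothing is replaced; repeated
    misses keep decaying heats until some block qualifies.
    """
    victim = None
    for k, (_, heat) in enumerate(blocks):
        if heat <= T and (victim is None or heat < blocks[victim][1]):
            victim = k
    if victim is not None:
        blocks[victim] = [new_tag, S]   # replaced data starts at the initial heat
    for k in range(len(blocks)):
        if k != victim:
            blocks[k][1] = max(0, blocks[k][1] - b)

on_miss("e")
# blocks is now [["a", 5], ["b", 2], ["e", 8], ["d", 8]]:
# block "c" (heat 2, the smallest value <= T) was replaced by "e" at heat S
```

Guarding the victim choice with the threshold T is what distinguishes this scheme from plain LFU: a recently hot block is protected from eviction until its heat has decayed below the threshold.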

Embodiment 3

[0064] The present disclosure provides an electronic device comprising a memory, a processor, and computer instructions stored in the memory and run on the processor; when the computer instructions are executed by the processor, the steps of the cache memory replacement method described above are carried out.



Abstract

The present disclosure provides a cache memory replacement method and system based on usage heat, including: dividing the Cache into n Cache blocks, the n Cache blocks and their corresponding heat values forming a heat comparison group; on receiving a read request, judging whether the CPU data to be read exists in the Cache; on a hit, finding the hit Cache block and its corresponding heat value, increasing the heat value according to a preset heat enhancement factor, attenuating the heat values of the remaining missed Cache blocks according to the heat decay factor, and reading the CPU data from the hit Cache block; on a miss, finding the Cache block whose heat value is smallest and less than or equal to the replacement threshold, replacing the CPU data to be read into that Cache block, and attenuating the heat values of the remaining unreplaced Cache blocks according to the heat decay factor. The method counts the access frequency of code in the Cache and replaces a block only after its access heat has decreased; by tuning the heat enhancement factor and heat decay factor, it adapts to different execution codes while maintaining a high hit rate.

Description

technical field

[0001] The present disclosure relates to the technical field of data storage and reading, and in particular to a cache memory replacement method and system based on usage heat.

Background technique

[0002] The statements in this section merely provide background information related to the present disclosure and do not necessarily constitute prior art.

[0003] Cache memory (Cache) is a storage system located between the CPU (Central Processing Unit) and DRAM (Dynamic Random Access Memory) or flash memory. Generally, a Cache has a smaller capacity and higher speed than DRAM or flash. The speed of the CPU is much higher than that of main memory: when the CPU accesses data directly from memory, it has to wait for a certain period, whereas the Cache can hold part of the data the CPU has just used or frequently reuses. If the CPU needs that data again, it can fetch it directly from the Cache, thereby avoiding repeated memory accesses and reducing t...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/0808
CPC: G06F12/0808
Inventor: 刘超, 张洪柳, 于秀龙
Owner: TIH MICROELECTRONIC TECH CO LTD