
Cache replacement method under heterogeneous memory environment

A cache replacement technology for heterogeneous memory, applied in the field of computer science, which addresses problems such as inconsistent cache miss costs and the influence of the cache replacement algorithm on cache usage efficiency, thereby improving cache usage efficiency in a heterogeneous memory environment.

Active Publication Date: 2015-08-12
HUAZHONG UNIV OF SCI & TECH
Cites: 6 | Cited by: 21

AI Technical Summary

Problems solved by technology

The quality of the cache replacement algorithm directly affects cache efficiency, which in turn affects the overall performance of the system.
Traditional cache replacement algorithms, represented by LRU and its derivatives, perform well in a conventional DRAM-only memory environment. In a heterogeneous memory environment, however, the cache miss penalty is no longer uniform: DRAM and PCM have different access latencies, a new characteristic that traditional cache replacement algorithms do not take into account. This raises the problem of how to improve cache usage efficiency in a heterogeneous memory environment.
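To make the imbalance concrete, the toy calculation below compares average memory access time (AMAT) when misses are mostly refilled from DRAM versus mostly from PCM. All latency and miss-rate numbers are illustrative assumptions, not figures from the patent.

```c
/* Back-of-envelope illustration of the "inconsistent miss penalty" problem.
 * Assumed numbers only: a hit costs 10 cycles, a DRAM refill 100, a PCM
 * refill 400.  The miss rate is held constant in both scenarios. */
#include <stdio.h>

int main(void)
{
    const double hit = 10.0, dram = 100.0, pcm = 400.0;
    const double miss_rate = 0.05;

    /* If the replacement policy tends to evict DRAM-backed lines, most
     * misses refill cheaply from DRAM; if it evicts PCM-backed lines,
     * the same number of misses refills from the much slower PCM. */
    double amat_dram_heavy = hit + miss_rate * dram;   /* 15 cycles */
    double amat_pcm_heavy  = hit + miss_rate * pcm;    /* 30 cycles */

    printf("AMAT, misses mostly to DRAM: %.1f cycles\n", amat_dram_heavy);
    printf("AMAT, misses mostly to PCM : %.1f cycles\n", amat_pcm_heavy);
    return 0;
}
```

With an identical miss rate, average latency doubles purely because of where the misses land, which is why a replacement policy that ignores the line's backing memory leaves performance on the table.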


Examples


Embodiment 1

[0053] This embodiment describes a cache replacement method in a heterogeneous memory environment. As shown in Figure 1, the cache replacement method includes adding a source flag to the cache line hardware structure, used to mark whether the cache line data comes from DRAM or from PCM. As shown in Figure 2, the cache replacement method also includes adding a sampling storage unit in the CPU, which collects data reuse distance statistics. Figure 3 shows the format of the reuse distance statistics table used in the cache replacement method. As shown in Figure 4, the cache replacement method further comprises three sub-methods: a sampling sub-method, an equivalent position calculation sub-method and a replacement sub-method. The sampling sub-method performs sampling statistics on cache access behaviors, the equivalent position calculation sub-method calculates the equivalent position, and the replacement sub-method determines the cache line that needs to be replaced.
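As a rough structural sketch of the additions described in paragraph [0053], the C fragment below models a cache line extended with a source flag, a sampling storage unit holding a per-source reuse distance statistics table, and prototypes for the three sub-methods. The field widths, table sizes and function signatures are assumptions for illustration, not the patent's concrete hardware layout.

```c
/* Minimal structural sketch of the embodiment's additions (assumed layout). */
#include <stdint.h>

#define WAYS        16          /* assumed associativity                     */
#define RD_BUCKETS  32          /* assumed number of reuse-distance buckets  */
#define SAMPLE_SET  64          /* assumed number of sampled address slots   */

struct cache_line {
    uint64_t tag;
    uint8_t  valid;
    uint8_t  source;            /* added source flag: 0 = DRAM, 1 = PCM      */
    uint8_t  lru_pos;           /* position in the set's recency stack       */
};

/* Added sampling storage unit: remembers a subset of recently seen addresses
 * and accumulates the reuse distance statistics table, kept per source.     */
struct sampler {
    uint64_t sampled_addr[SAMPLE_SET];
    uint64_t reuse_hist[2][RD_BUCKETS];   /* [source][distance bucket]       */
};

/* The three sub-methods named in the embodiment (signatures are assumed).   */
void sample_access(struct sampler *s, uint64_t addr, uint8_t source);
int  equivalent_position(const struct sampler *s, uint8_t source, int lru_pos);
int  choose_victim(const struct cache_line set[WAYS], const struct sampler *s);
```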

[0054] The source flag is denoted I, and it is set as follows: when a cache miss occurs and data must be read from memory, the received data block is determine...
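Since paragraph [0054] is truncated here, the fragment below (continuing the struct cache_line sketch above) is only a plausible reconstruction of how the source flag might be set on a miss. In particular, classifying the source by physical address range is an assumption; a real design could instead take this information from the memory controller's fill response.

```c
/* Assumed split of the physical address space; not specified in the excerpt. */
#define PCM_BASE  0x100000000ULL   /* hypothetical start of the PCM range */

/* Decide which memory a physical address belongs to (assumed rule). */
static uint8_t classify_source(uint64_t paddr)
{
    return (paddr >= PCM_BASE) ? 1 /* PCM */ : 0 /* DRAM */;
}

/* On a cache miss, fill the chosen victim way and record where the data
 * came from, so later replacement decisions can consult the source flag. */
static void fill_on_miss(struct cache_line *victim, uint64_t paddr, uint64_t tag)
{
    victim->tag     = tag;
    victim->valid   = 1;
    victim->source  = classify_source(paddr);
    victim->lru_pos = 0;           /* newly filled line becomes most recent */
}
```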



Abstract

The invention discloses a cache replacement method under a heterogeneous memory environment. The method comprises: adding a source flag bit in the cache line hardware structure, for flagging whether the cache line data is derived from DRAM (Dynamic Random Access Memory) or PCM (Phase Change Memory); and adding a new sampling storage unit in the CPU (Central Processing Unit), for recording program cache access behaviors and data reuse distance information. The method also comprises three sub-methods: a sampling sub-method, an equivalent position calculation sub-method and a replacement sub-method. The sampling sub-method performs sampling statistics on cache access behaviors; the equivalent position calculation sub-method calculates equivalent positions; and the replacement sub-method determines the cache line that needs to be replaced. Targeting the cache access characteristics of programs under a heterogeneous memory environment, the method optimizes the traditional cache replacement policy; implementing it reduces the high-latency cost of accessing PCM on cache misses, thereby improving the cache access performance of the whole system.
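As a self-contained illustration of how the three sub-methods could cooperate, the toy model below treats PCM-backed lines as if they sat closer to the MRU end by scaling their recency position, then evicts the line whose resulting equivalent position is closest to the LRU end. The scaling rule and the penalty ratio are assumed stand-ins; the excerpt does not give the actual equivalent-position formula, which the method derives from the sampled statistics.

```c
/* Toy model of equivalent-position-based victim selection (assumptions only). */
#include <stdint.h>

#define WAYS 16

struct line_state {
    uint8_t valid;
    uint8_t source;     /* 0 = DRAM, 1 = PCM */
    uint8_t lru_pos;    /* 0 = MRU ... WAYS-1 = LRU */
};

/* Assumed ratio of PCM to DRAM miss penalty; in the actual method this
 * would come from the sampled reuse-distance / access statistics. */
static const int penalty_ratio = 4;

/* Equivalent position: PCM-backed lines appear closer to the MRU end,
 * so they are evicted less eagerly than their raw recency suggests. */
static int equivalent_position(const struct line_state *l)
{
    if (l->source == 1 /* PCM */)
        return l->lru_pos / penalty_ratio;
    return l->lru_pos;
}

/* Replacement step: evict the line whose equivalent position is closest
 * to the LRU end (invalid lines are taken first). */
static int choose_victim(const struct line_state set[WAYS])
{
    int victim = 0, worst = -1;
    for (int w = 0; w < WAYS; w++) {
        if (!set[w].valid)
            return w;
        int ep = equivalent_position(&set[w]);
        if (ep > worst) { worst = ep; victim = w; }
    }
    return victim;
}
```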

Description

Technical Field

[0001] The invention belongs to the technical field of computer science, and more specifically relates to a cache replacement method in a heterogeneous memory environment.

Background Technique

[0002] The development of traditional memory technology, represented by Dynamic Random Access Memory (DRAM), has encountered a bottleneck in recent years. Restricted by the manufacturing process, it is becoming increasingly difficult to obtain larger-capacity DRAM at a lower cost, and as capacity continues to grow, the high energy consumption of DRAM memory is becoming increasingly prominent. The rise and development of new non-volatile memory (Non-Volatile Memory, NVM for short) technology provides an opportunity to break through the performance and energy-consumption bottleneck of traditional DRAM memory. Non-volatile memory devices represented by Phase Change Memory (PCM) have good scalability, and are closer to the delay and ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/12
Inventors: 廖小飞, 刘东, 金海
Owner: HUAZHONG UNIV OF SCI & TECH