
Adaptive cache prefetching based on competing dedicated prefetch policies in dedicated cache sets to reduce cache pollution

An adaptive caching technique in the cache memory domain, used to reduce cache misses while addressing problems such as increased hardware overhead.

Status: Inactive | Publication Date: 2016-11-23
QUALCOMM INC

AI Technical Summary

Problems solved by technology

However, this cache prefetch policy, which replaces only dead lines, adds hardware overhead to track the timing of accesses to cache lines in the cache.

Method used




Embodiment Construction

[0024] Referring now to the figures, several exemplary aspects of the invention are described. The word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any aspect described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other aspects.

[0025] Aspects disclosed in the detailed description include adaptive cache prefetching based on competing dedicated prefetch policies in dedicated cache sets to reduce cache pollution. In one aspect, an adaptive cache prefetch circuit for prefetching data into a cache is provided. Instead of attempting to determine the best replacement policy for the cache, the adaptive cache prefetch circuit is configured to determine a prefetch policy based on the results of competing dedicated prefetch policies applied to dedicated cache sets in the cache. In this regard, a subset of the cache sets in the cache is designated as "dedicated" cache sets. The other, non-dedicated cache sets are "follower" cache sets that adopt the prefetch policy selected by the competition among the dedicated prefetch policies.
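The patent text does not give an implementation, but the partitioning it describes can be illustrated with a short sketch. The following hypothetical C++ fragment (the parameters kNumSets and kDuelInterval, the PolicyId names, and the modulo-based assignment are illustrative assumptions, not the patent's design) shows one way a small subset of cache sets could be dedicated to each of two competing prefetch policies, with all remaining sets acting as followers:

#include <cstdint>

// Hypothetical cache geometry; the patent does not fix these values.
constexpr uint32_t kNumSets      = 1024;  // total cache sets
constexpr uint32_t kDuelInterval = 32;    // 2 of every 32 sets are dedicated

enum class PolicyId : uint8_t { kPolicyA, kPolicyB, kFollower };

// Assign a small, fixed subset of sets to each competing prefetch
// policy; all remaining sets are followers that use whichever
// dedicated policy is currently winning the competition.
PolicyId ClassifySet(uint32_t setIndex) {
    if (setIndex % kDuelInterval == 0) return PolicyId::kPolicyA;
    if (setIndex % kDuelInterval == 1) return PolicyId::kPolicyB;
    return PolicyId::kFollower;
}

Keeping the dedicated sets to a small fraction of the cache means the cost of running the losing policy is confined to a few sets, while the bulk of the cache (the followers) benefits from the winner.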


Abstract

Adaptive cache prefetching based on competing dedicated prefetch policies in dedicated cache sets to reduce cache pollution is disclosed. In one aspect, an adaptive cache prefetch circuit is provided for prefetching data into a cache. The adaptive cache prefetch circuit is configured to determine which prefetch policy to use as a replacement policy based on competing dedicated prefetch policies applied to dedicated cache sets in the cache. Each dedicated cache set has an associated dedicated prefetch policy used as a replacement policy for the given dedicated cache set. Cache misses for accesses to each of the dedicated cache sets are tracked by the adaptive cache prefetch circuit. The adaptive cache prefetch circuit can be configured to apply a prefetch policy to the other follower (i.e., non-dedicated) cache sets in the cache using the dedicated prefetch policy that incurred fewer cache misses to its respective dedicated cache sets to reduce cache pollution.
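The abstract states only that cache misses to each group of dedicated sets are tracked and compared; it does not specify the tracking circuit. One plausible sketch, reusing the PolicyId enum from the fragment above and borrowing the saturating "policy select" counter familiar from set-dueling replacement schemes (the counter, its bound kMax, and the method names are assumptions, not the patent's circuit):

#include <algorithm>

// Hypothetical saturating policy-select counter. Misses in each
// group of dedicated sets bias the counter; its sign picks the
// prefetch policy applied to the follower sets.
class PolicySelector {
public:
    // Call on every cache miss; only dedicated sets move the counter.
    void OnMiss(PolicyId policyOfSet) {
        if (policyOfSet == PolicyId::kPolicyA) {
            psel_ = std::min(psel_ + 1, kMax);   // A missed: evidence for B
        } else if (policyOfSet == PolicyId::kPolicyB) {
            psel_ = std::max(psel_ - 1, -kMax);  // B missed: evidence for A
        }
    }

    // Follower sets adopt the policy that incurred fewer recent misses.
    PolicyId WinningPolicy() const {
        return (psel_ <= 0) ? PolicyId::kPolicyA : PolicyId::kPolicyB;
    }

private:
    static constexpr int kMax = 512;  // saturation bound (assumed)
    int psel_ = 0;
};

On each miss to a dedicated set the counter is nudged against that set's policy, so follower sets simply query WinningPolicy() when deciding which prefetch policy governs them; a saturating counter also lets the choice adapt as the workload's behavior changes.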

Description

[0001] Priority Claim

[0002] This application claims priority to the application entitled "ADAPTIVE CACHE PREFETCHING BASED ON COMPETING DEDICATED PREFETCH POLICIES IN DEDICATED CACHE SETS TO REDUCE CACHE POLLUTION," filed April 4, 2014, which is incorporated herein by reference in its entirety.

Technical Field

[0003] The techniques of this disclosure relate generally to cache memory provided in computer systems and, more specifically, to prefetching cache lines into cache memory to reduce cache misses.

Background

[0004] A memory unit is the basic building block of computer data storage (also referred to as "memory"). A computer system can read data from and write data to memory. As an example, memory may be used to provide cache memory in a central processing unit (CPU) system. A cache memory, which may also be referred to simply as a "cache," stores copies of data at frequently accessed memory addresses in main memory or a higher-level cache memory.
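Since the disclosed mechanism operates per cache set, it may help to recall how an address maps to a set in a set-associative cache. The minimal sketch below (the 64-byte line size and 1024-set geometry are illustrative assumptions matching the earlier fragments) extracts the set index that would then be classified as dedicated or follower by ClassifySet above:

#include <cstdint>

constexpr uint32_t kLineBits = 6;   // log2(64-byte cache line), assumed
constexpr uint32_t kSetBits  = 10;  // log2(1024 sets), assumed

// The set index selects which cache set an address falls into, and
// therefore which prefetch policy (dedicated A, dedicated B, or
// follower) governs prefetches triggered by accesses to it.
uint32_t SetIndexOf(uint64_t address) {
    return (address >> kLineBits) & ((1u << kSetBits) - 1);
}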

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/08
CPC: G06F12/0862; G06F12/0864; G06F2212/6024; Y02D10/00; G06F12/0875; G06F12/128; G06F2212/283; G06F2212/602; G06F2212/6046
Inventors: Harold Wade Cain III; David John Palframan
Owner: QUALCOMM INC