
Reducing power consumption at a cache

A cache and caching technique, applied in the field of memory systems, that addresses the problem of on-processor caches consuming a considerable amount of power and achieves the effect of reducing tag accesses and power consumption.

Status: Inactive; Publication Date: 2007-05-30
FUJITSU LTD


Problems solved by technology

[0002] Cache memory on a processor typically consumes a considerable amount of power.




Detailed Description of the Embodiments

[0010] FIG. 1 illustrates an example non-uniform cache architecture for reducing power consumption at cache 10. In a particular embodiment, cache 10 is a component of a processor that temporarily stores code for execution on the processor. References to "code" include one or more executable instructions, other code, or both, where appropriate. Cache 10 includes a number of sets 12, a number of ways 14, and a number of tags 16. Sets 12 logically intersect ways 14 and tags 16. A logical intersection between a set 12 and a way 14 includes multiple adjacent memory locations in cache 10 for storing code. A logical intersection between a set 12 and a tag 16 includes one or more adjacent memory locations in cache 10 for storing a tag that identifies code stored in cache 10, or data for locating and identifying the code stored in cache 10. By way of example and not limitation, the first logica...
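
A minimal Python sketch of this structure may help; it is not taken from the patent, and names such as NonUniformCache, ways_per_set, and tag_compares are illustrative assumptions. Each set owns only as many (tag, code) entries as it has ways, so a lookup in a set with fewer ways performs fewer tag comparisons, which is used here as a rough proxy for tag-access power.

    # Illustrative model only; class and field names are assumptions, not the patent's.
    class NonUniformCache:
        def __init__(self, ways_per_set, line_size=32):
            # ways_per_set gives the associativity of each set, e.g. [4, 2, 1, 2].
            self.line_size = line_size
            self.num_sets = len(ways_per_set)
            # Each set holds only as many (tag, code) entries as it has ways.
            self.sets = [[None] * ways for ways in ways_per_set]
            self.tag_compares = 0  # rough proxy for tag-access energy

        def _index_and_tag(self, address):
            line = address // self.line_size
            return line % self.num_sets, line // self.num_sets

        def lookup(self, address):
            index, tag = self._index_and_tag(address)
            # A set with fewer ways triggers fewer tag comparisons per access.
            for entry in self.sets[index]:
                self.tag_compares += 1
                if entry is not None and entry[0] == tag:
                    return entry[1]  # hit
            return None              # miss

        def fill(self, address, code, way=0):
            index, tag = self._index_and_tag(address)
            self.sets[index][way] = (tag, code)

    cache = NonUniformCache(ways_per_set=[4, 2, 1, 2])
    cache.fill(0x0040, "instruction block")
    assert cache.lookup(0x0040) == "instruction block"
    print("tag compares so far:", cache.tag_compares)  # 1, because set 2 has a single way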



Abstract

Reducing power consumption at a cache. In one embodiment, a method for reducing power consumption at a cache includes determining a nonuniform architecture for a cache that provides an optimum number of cache ways for each cache set in the cache. The nonuniform architecture allows cache sets in the cache to have associativity values that differ from each other. The method also includes implementing the nonuniform architecture in the cache to reduce power consumption at the cache. In another embodiment, the method also includes determining a code placement according to which code is writeable to a memory separate from the cache. The code placement reduces occurrences of inter cache-line sequential flows when the code is loaded from the memory to the cache. The method also includes compiling the code according to the code placement and writing the code to the memory for subsequent loading from the memory to the cache according to the code placement to further reduce power consumption at the cache.
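
The second half of the abstract concerns code placement. The sketch below is an illustrative interpretation only, not the patent's algorithm: basic blocks are packed so that straight-line execution crosses a cache-line boundary as rarely as possible, since each such crossing (an inter cache-line sequential flow) forces another tag access. The block sizes, the 32-byte line size, and the greedy padding strategy are assumptions, and only within-block crossings are counted.

    # Illustrative only: a greedy packer that pads a basic block to the next
    # cache-line boundary when it would otherwise straddle two lines.
    LINE_SIZE = 32  # bytes per cache line (assumed)

    def place_blocks(block_sizes):
        placement, addr = [], 0
        for size in block_sizes:
            offset = addr % LINE_SIZE
            if size <= LINE_SIZE and offset + size > LINE_SIZE:
                addr += LINE_SIZE - offset  # start the block on a fresh line
            placement.append(addr)
            addr += size
        return placement

    def line_crossings(placement, block_sizes):
        # Count blocks whose straight-line execution spills onto a second cache
        # line; each spill is an inter cache-line sequential flow that needs
        # another tag access.
        return sum(1 for start, size in zip(placement, block_sizes)
                   if start // LINE_SIZE != (start + size - 1) // LINE_SIZE)

    sizes = [12, 24, 8, 20, 16]                              # assumed block sizes in bytes
    naive = [sum(sizes[:i]) for i in range(len(sizes))]      # blocks laid out back to back
    packed = place_blocks(sizes)
    print("naive crossings: ", line_crossings(naive, sizes))   # 1
    print("packed crossings:", line_crossings(packed, sizes))  # 0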

Description

Technical Field

[0001] The present invention relates generally to memory systems and, more particularly, to reducing power consumption at a cache.

Background

[0002] Cache memory on a processor typically consumes a considerable amount of power. As an example, the instruction cache on an ARM920T processor consumes about 25% of the processor's power. As another example, the instruction cache on a StrongARM SA-110 processor, which targets low-power applications, consumes about 27% of the processor's power.

Summary of the Invention

[0003] Embodiments of the present invention may reduce or eliminate problems or disadvantages associated with existing memory systems.

[0004] In one embodiment, a method for reducing power consumption at a cache is provided, the method comprising the steps of: determining a non-uniform architecture of the cache that provides an optimal number of cache ways for each cache set in said cach...
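
The determination step in [0004] (truncated above) amounts to picking a possibly different associativity for every cache set. One illustrative way to do that, not taken from the patent text, is to replay an address trace through each set and keep the fewest ways whose extra misses stay within a small tolerance of the misses seen at maximum associativity. The trace, the parameters NUM_SETS, MAX_WAYS, LINE_SIZE, and slack, and the LRU policy below are all assumptions for illustration.

    # Illustrative only: per-set way selection from an address trace.
    from collections import defaultdict

    NUM_SETS, MAX_WAYS, LINE_SIZE = 4, 4, 32

    def misses_for_set(tag_trace, ways):
        # Simulate one LRU-managed set with the given number of ways.
        lru, misses = [], 0
        for tag in tag_trace:
            if tag in lru:
                lru.remove(tag)
            else:
                misses += 1
                if len(lru) == ways:
                    lru.pop(0)       # evict the least recently used entry
            lru.append(tag)          # most recently used goes to the back
        return misses

    def choose_ways(trace, slack=1):
        per_set = defaultdict(list)
        for addr in trace:
            line = addr // LINE_SIZE
            per_set[line % NUM_SETS].append(line // NUM_SETS)
        ways_per_set = []
        for s in range(NUM_SETS):
            baseline = misses_for_set(per_set[s], MAX_WAYS)
            for ways in range(1, MAX_WAYS + 1):
                # Keep the fewest ways whose extra misses stay within the slack.
                if misses_for_set(per_set[s], ways) <= baseline + slack:
                    ways_per_set.append(ways)
                    break
        return ways_per_set

    trace = [0x000, 0x080, 0x100, 0x000, 0x080, 0x020, 0x0A0, 0x020]  # toy trace
    print(choose_ways(trace))  # [3, 1, 1, 1] for this toy trace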

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F12/08G06F1/32
CPCY02B60/1225G06F2212/271G06F2212/1028Y02B60/1228G06F1/3225G06F1/3275G06F12/0864Y02B60/32Y02D10/00Y02D30/50
Inventors: Toru Ishihara, Farzan Fallah
Owner: FUJITSU LTD