
System-level cache

A system-level caching technology applicable to memory systems, energy-efficient computing, instrumentation, etc., that solves problems such as the inability of client devices to allocate cache lines, and that serves to reduce latency and reduce power consumption

Pending Publication Date: 2021-06-01
GOOGLE LLC
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

So, for example, a cache policy may specify that requests from one or more client devices are always stored in four ways, and that cache lines in those four ways cannot be allocated for any other client device
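The fixed-reservation policy described above can be sketched as follows. This is an illustrative model only, not code from the patent; all names (`WayReservationPolicy`, `reserve`, `allocatable_ways`) are hypothetical.

```python
# Sketch of a per-client way-reservation policy: requests from a client with a
# reservation are cached only in its reserved ways, and those ways can never
# be allocated for any other client device.

class WayReservationPolicy:
    def __init__(self, total_ways):
        self.total_ways = total_ways
        self.reserved = {}  # client_id -> set of reserved way indices

    def reserve(self, client_id, way_indices):
        # A way already reserved for one client cannot be given to another.
        taken = set().union(*self.reserved.values())
        if taken & set(way_indices):
            raise ValueError("way already reserved for another client")
        self.reserved[client_id] = set(way_indices)

    def allocatable_ways(self, client_id):
        # A client with a reservation allocates only into its own ways;
        # unreserved ways remain available to clients without reservations.
        if client_id in self.reserved:
            return sorted(self.reserved[client_id])
        taken = set().union(*self.reserved.values())
        return sorted(set(range(self.total_ways)) - taken)

policy = WayReservationPolicy(total_ways=16)
policy.reserve("display", [0, 1, 2, 3])  # four ways dedicated to one client
```

Under this model, the reserving client sees exactly its four ways, while every other client is restricted to the remaining twelve.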

Method used



Examples


Embodiment 1

[0123] Embodiment 1 is a system comprising:

[0124] multiple integrated client devices;

[0125] a memory controller configured to read data from the memory device; and

[0126] a system-level cache configured to cache data requests through the memory controller for each integrated client device of the plurality of integrated client devices,

[0127] wherein the system-level cache includes a cache memory having a plurality of ways, each of the ways being a primary way or a secondary way,

[0128] wherein each main way is dedicated to a single corresponding partition corresponding to a memory buffer accessed by one or more client devices, and

[0129] wherein each secondary way is shared by a group of multiple partitions corresponding to a set of memory buffers accessed by the client devices, and

[0130] wherein the system-level cache is configured to maintain a mapping between partitions and priority levels, and is configured to assign primary ways to the corresponding enabled partitions in an order corresponding to the respective priority levels.
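The priority-ordered primary-way assignment described in embodiment 1 can be sketched with a simple list-based model. The function and field names here are illustrative, not from the patent.

```python
def assign_primary_ways(partitions, primary_ways):
    """Assign primary ways to enabled partitions in priority order.

    `partitions` is a list of dicts with 'name', 'enabled', 'priority'
    (lower value = higher priority), and 'ways_requested'.
    Disabled partitions receive no ways. Illustrative model only.
    """
    free = list(primary_ways)
    assignment = {}
    # Walk enabled partitions from highest to lowest priority, carving
    # each one's requested ways out of the remaining free primary ways.
    for p in sorted((p for p in partitions if p["enabled"]),
                    key=lambda p: p["priority"]):
        take, free = free[:p["ways_requested"]], free[p["ways_requested"]:]
        assignment[p["name"]] = take
    return assignment

demo = assign_primary_ways(
    [{"name": "display", "enabled": True, "priority": 0, "ways_requested": 3},
     {"name": "gpu", "enabled": True, "priority": 1, "ways_requested": 2},
     {"name": "isp", "enabled": False, "priority": 2, "ways_requested": 4}],
    primary_ways=list(range(8)))
```

Because assignment proceeds strictly in priority order, a high-priority partition is never starved of primary ways by a lower-priority one.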

Embodiment 2

[0131] Embodiment 2 is the system of embodiment 1, wherein the system-level cache is configured to assign a primary way exclusively to a first partition accessed by one or more first client devices, and is configured to assign a secondary way shared by the first partition and one or more other partitions accessed by a group of client devices that also includes the first client device.

Embodiment 3

[0132] Embodiment 3 is the system of embodiment 2, wherein the system-level cache is configured to maintain a mapping between groups of client devices and secondary priority levels, and is configured to assign the secondary ways to the corresponding enabled partitions in an order corresponding to the respective secondary priority levels.



Abstract

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for a system-level cache to allocate cache resources by a way-partitioning process. One of the methods includes maintaining a mapping between partitions and priority levels and allocating primary ways to respective enabled partitions in an order corresponding to the respective priority levels assigned to the enabled partitions.

Description

Background technique

[0001] This specification relates to systems having integrated circuit devices.

[0002] A system-level cache (SLC) is a device that caches data retrieved from memory, or data to be stored to memory, for a number of different hardware devices in the system. In other words, different cache lines of the SLC can store data belonging to different hardware devices.

[0003] Typically, the multiple different hardware devices are components integrated into a system on a chip (SOC). In this specification, a device that issues read requests and write requests through the SLC will be referred to as a client device.

[0004] Caching can be used to reduce power consumption by reducing main-memory usage. In other words, the main memory and the path to the main memory can be placed in a low-power state as long as the client devices can access the data they need in the cache.

[0005] Caches are usually organized as sets with multiple ways. The request...
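The set-and-way organization mentioned in [0005] can be illustrated with a minimal set-associative lookup. The parameters and function names below are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of a set-associative cache lookup: an address maps to
# exactly one set, and its tag is compared against every way in that set.

LINE_SIZE = 64    # bytes per cache line (illustrative)
NUM_SETS = 1024   # illustrative
NUM_WAYS = 16     # illustrative

def set_index_and_tag(addr):
    # The line number selects a set; the remaining high bits form the tag.
    line = addr // LINE_SIZE
    return line % NUM_SETS, line // NUM_SETS

def lookup(cache, addr):
    """cache: list of NUM_SETS dicts mapping way index -> stored tag."""
    idx, tag = set_index_and_tag(addr)
    for way, stored_tag in cache[idx].items():
        if stored_tag == tag:
            return way   # hit: the line resides in this way
    return None          # miss

cache = [dict() for _ in range(NUM_SETS)]
idx, tag = set_index_and_tag(0x12340)
cache[idx][5] = tag      # install the line in way 5
```

Way-partitioning, as described in the embodiments above, restricts which of the `NUM_WAYS` entries in each set a given partition is allowed to allocate into, while lookups still search all ways.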

Claims


Application Information

IPC(8): G06F12/084; G06F12/0846
CPC: G06F12/084; G06F12/0848; G06F13/1694; Y02D10/00; G06F12/0811; G06F12/0877; G06F12/0815
Inventor: Vinod Chamarty; Xiaoyu Ma; Hongil Yoon; Keith Robert Pflederer; Weiping Liao; Benjamin Dodge; Albert Meixner; Allan Douglas Knies; Manu Gulati; Rahul Jagdish Thakur; Jason Rupert Redgrave
Owner GOOGLE LLC