
Memory management methods and systems that support cache consistency

A memory management and consistency technology, in the field of memory management methods and systems that support cache consistency. It addresses the limitation of prior attempts to achieve a sequentially consistent, lumped architecture, which restrict the number of memory operations that can be included in an instruction group.

Active Publication Date: 2008-05-20
INTELLECTUAL VENTURES HOLDING 81 LLC

AI Technical Summary

Benefits of technology

The present invention provides a method or system for maintaining sequential consistency in a lumped architecture, which allows multiple memory operations to be included in an instruction group. The method includes executing a group of instructions that includes a cache line access instruction, updating an indicator to indicate that the cache line has been accessed, and rolling back and reissuing the instruction group if an external agent snoops the cache and the cache line is indicated as having been accessed. This allows for more efficient processing and better performance in a computer system.
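The rollback-and-reissue mechanism described above can be sketched in a few lines. This is an illustrative model only, not code from the patent: the class and function names (`InstructionGroup`, `snoop`), the checkpoint representation, and the operation format are all assumptions made for the sketch. The idea it shows is the one in the summary: each in-flight instruction group marks the cache lines it has accessed, and a snoop from an external agent that hits a marked line forces the group to roll back and reissue.

```python
# Hypothetical sketch of speculative instruction-group execution with
# per-group "accessed" indicators. Names and data layout are illustrative.

class InstructionGroup:
    def __init__(self, instructions):
        self.instructions = instructions   # list of (op, cache_line) tuples
        self.accessed_lines = set()        # "accessed" indicators for this group

    def execute(self, memory):
        """Run the group speculatively; return a checkpoint for rollback."""
        checkpoint = dict(memory)          # architectural state before the group
        for op, line in self.instructions:
            self.accessed_lines.add(line)  # mark the line as accessed
            if op == "store":
                memory[line] = memory.get(line, 0) + 1
        return checkpoint

def snoop(group, memory, checkpoint, snooped_line):
    """An external agent snoops a cache line while the group is in flight."""
    if snooped_line in group.accessed_lines:
        memory.clear()
        memory.update(checkpoint)          # undo the group's effects
        group.accessed_lines.clear()
        return "rollback-and-reissue"      # group must be re-executed
    return "no-conflict"
```

A snoop on a line the group never touched leaves the group's results intact; only a snoop on a marked line triggers the rollback, which is what lets the group safely contain multiple memory operations.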

Problems solved by technology

Prior attempts to achieve a sequentially consistent, lumped architecture are limited with respect to the number of memory operations that can be included in an instruction group.
Specifically, such attempts are limited to a single memory operation per instruction group.



Embodiment Construction

[0015]Reference will now be made in detail to the various embodiments of the invention, examples of which are illustrated in the accompanying drawings. While the invention will be described in conjunction with these embodiments, it will be understood that they are not intended to limit the invention to these embodiments. On the contrary, the invention is intended to cover alternatives, modifications and equivalents, which may be included within the spirit and scope of the invention as defined by the appended claims. Furthermore, in the following detailed description of the present invention, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be recognized by one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail as not to unnecessarily obscure a...


Abstract

Methods and systems for maintaining cache consistency are described. A group of instructions is executed. The group of instructions can include multiple memory operations, and also includes an instruction that when executed causes a cache line to be accessed. In response to execution of that instruction, an indicator associated with the group of instructions is updated to indicate that the cache line has been accessed. The cache line is indicated as having been accessed until execution of the group of instructions is ended.

Description

BACKGROUND OF THE INVENTION

[0001] 1. Field of the Invention

[0002] Embodiments of the present invention relate to computer system memory, in particular the management of cache memory.

[0003] 2. Related Art

[0004] With direct memory access (DMA), an input/output (I/O) system can issue read requests and writes directly to main memory without passing through the central processing unit (CPU). However, if the I/O system uses DMA to write to main memory and changes data previously cached by the CPU, the CPU will not receive the new data unless it fetches the data from main memory. Likewise, for DMA reads, the CPU cache may contain more recent data than main memory, so the I/O system will not receive the new data unless it reads the cache instead of main memory. Multiprocessor systems, particularly shared-memory symmetric multiprocessor (SMP) architectures, have to deal with similar types of scenarios. The MESI (Modified, Exclusive, Shared, Invalid) protocol ...
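The MESI protocol named in the background can be summarized as a small state machine per cache line. The sketch below is the textbook transition table, not code from the patent; the event names (`local_read`, `remote_write`, etc.) are illustrative, and the `I + local_read -> E` transition assumes no other cache currently holds the line.

```python
# Minimal sketch of MESI state transitions for a single cache line.
# States: M = Modified, E = Exclusive, S = Shared, I = Invalid.

MESI = {
    # (current_state, event) -> next_state
    ("I", "local_read"):   "E",  # read miss; assumes no other sharer
    ("I", "local_write"):  "M",
    ("E", "local_write"):  "M",  # silent upgrade, line already exclusive
    ("E", "remote_read"):  "S",
    ("E", "remote_write"): "I",
    ("S", "local_write"):  "M",  # upgrade; other copies are invalidated
    ("S", "remote_write"): "I",
    ("M", "remote_read"):  "S",  # write back dirty data, then share
    ("M", "remote_write"): "I",
}

def step(state, event):
    """Apply one local or bus event; state is unchanged if no entry matches."""
    return MESI.get((state, event), state)
```

Tracking these transitions is how an SMP system detects the DMA scenario described above: a remote write to a line held Modified or Shared invalidates the stale copy, forcing the next local read to fetch fresh data.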


Application Information

Patent Type & Authority: Patents (United States)
IPC(8): G06F12/00
CPC: G06F9/3824; G06F9/3834; G06F9/3861; G06F9/466; G06F12/0815; G06F9/3885; G06F12/0806
Inventor: ROZAS, GUILLERMO J.
Owner: INTELLECTUAL VENTURES HOLDING 81 LLC