Method and apparatus for directory-based coherence with distributed directory management utilizing prefetch caches

Active Publication Date: 2005-09-29
META PLATFORMS INC

AI Technical Summary

Problems solved by technology

If one processor updates the data in its cache without informing the other processor in some manner, an inconsistency results, and it becomes possible that the other processor will use a stale data value.
There are at least two major factors affecting cache mechanisms: performance and implementation cost.
If the time to access main memory is too slow, performance degrades significantly, and potential parallelism is lost.
Implementation cost is also an issue because the performance must be obtained at a reasonable cost.
Implementation costs arise from adding coherence hardware or from programming consistency-enforcing compilers.
In addition to these two major factors, there are four primary issues to consider when designing a cache coherence mechanism.
The fourth is cache block size, that is, the size of a line in the cache, and how it affects system performance.
However, while CDRAMs can have extremely high bandwidth on dedicated busses, they often have no logic execution ability, which exacerbates the coherency problem.
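To make the stale-data problem described above concrete, the following is a minimal sketch, not the patent's implementation, of a directory entry that records which processors hold a copy of a cache line and invalidates the other copies when one processor writes. The names NUM_PROCS, dir_entry, dir_read, and dir_write are illustrative assumptions.

```c
/* Minimal directory-based coherence sketch (illustrative only): track which
 * processors share a line so a write invalidates stale copies instead of
 * letting another processor keep using a stale value. */
#include <stdint.h>
#include <stdio.h>

#define NUM_PROCS 4

typedef struct {
    uint32_t sharers;   /* bit i set => processor i holds a copy */
    int      owner;     /* processor with an exclusive (modified) copy, or -1 */
} dir_entry;

/* Processor p reads the line: force a writeback if another processor owns
 * it exclusively, then record p as a sharer. */
static void dir_read(dir_entry *e, int p) {
    if (e->owner >= 0 && e->owner != p) {
        printf("writeback from P%d before P%d reads\n", e->owner, p);
        e->owner = -1;
    }
    e->sharers |= 1u << p;
}

/* Processor p writes the line: invalidate every other cached copy so no
 * processor can continue using a stale value. */
static void dir_write(dir_entry *e, int p) {
    for (int i = 0; i < NUM_PROCS; i++)
        if (i != p && (e->sharers & (1u << i)))
            printf("invalidate copy in P%d\n", i);
    e->sharers = 1u << p;
    e->owner = p;
}

int main(void) {
    dir_entry e = { .sharers = 0, .owner = -1 };
    dir_read(&e, 0);   /* P0 caches the line */
    dir_read(&e, 1);   /* P1 caches the line */
    dir_write(&e, 0);  /* P0 writes: P1's copy is invalidated, not left stale */
    return 0;
}
```

A real directory protocol would also handle transient states and acknowledgement messages; this sketch only shows the bookkeeping that prevents a stale read.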


Embodiment Construction

[0032] In the following discussion, numerous specific details are set forth to provide a thorough understanding of the present invention. However, those skilled in the art will appreciate that the present invention may be practiced without such specific details. In other instances, well-known elements have been illustrated in schematic or block diagram form in order not to obscure the present invention in unnecessary detail. Additionally, for the most part, details concerning network communications, electromagnetic signaling techniques, and the like, have been omitted inasmuch as such details are not considered necessary to obtain a complete understanding of the present invention, and are considered to be within the understanding of persons of ordinary skill in the relevant art.

[0033] It is further noted that, unless indicated otherwise, all functions described herein may be performed in either hardware or software, or some combination thereof. In one embodiment, however, the funct...

Abstract

The present invention provides a parallel processing architecture in which a plurality of processors has access to a shared memory hierarchy level. The memory hierarchy level has a coherence directory with associated directory data covering a plurality of cachelines, each associated with different data. Prefetch caches are interconnected to processor memory and to a plurality of processor elements, each element interconnected to different buffers. Cache lines are requested from memory, and the requests, responses, and the access modes detected therein provide additional coherence of the data. Processing of the directory data is performed by the processing elements. In one embodiment, the system comprises an integrated prefetch cache.
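As a rough illustration of the structure the abstract describes, a coherence directory covering many cache lines plus a prefetch cache consulted on line requests, the sketch below uses assumed names and sizes (dir_line, prefetch_cache, DIR_LINES, PREFETCH_SLOTS); it is not the claimed apparatus.

```c
/* Hypothetical sketch: a directory entry per tracked cache line, plus a small
 * prefetch cache checked before a line request goes to the directory/memory. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define DIR_LINES      16   /* cache lines tracked by the directory         */
#define PREFETCH_SLOTS  4   /* entries in the per-processor prefetch cache  */

typedef struct {
    uint32_t sharers;       /* which processor elements hold the line */
} dir_line;

typedef struct {
    uint64_t tag[PREFETCH_SLOTS];
    bool     valid[PREFETCH_SLOTS];
} prefetch_cache;

/* Look the line up in the prefetch cache; on a miss, fall through to the
 * directory, record the requester as a sharer, and fill the prefetch cache. */
static bool request_line(prefetch_cache *pc, dir_line dir[], uint64_t line, int proc) {
    int slot = (int)(line % PREFETCH_SLOTS);
    if (pc->valid[slot] && pc->tag[slot] == line) {
        printf("P%d: prefetch-cache hit for line %llu\n", proc, (unsigned long long)line);
        return true;
    }
    dir_line *e = &dir[line % DIR_LINES];
    e->sharers |= 1u << proc;     /* directory now records the new sharer */
    pc->valid[slot] = true;       /* fill the prefetch cache for next time */
    pc->tag[slot] = line;
    printf("P%d: miss, line %llu fetched, sharers=0x%x\n",
           proc, (unsigned long long)line, e->sharers);
    return false;
}

int main(void) {
    dir_line dir[DIR_LINES] = {0};
    prefetch_cache pc = {0};
    request_line(&pc, dir, 42, 0);   /* miss: goes to directory/memory       */
    request_line(&pc, dir, 42, 0);   /* hit:  served from the prefetch cache */
    return 0;
}
```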

Description

CROSS-REFERENCED APPLICATIONS

[0001] This application relates to co-pending U.S. patent application entitled “A Method and Apparatus for Directory-Based Coherence with Distributed Directory Management” (Docket No. AUS920030719US1), filed concurrently herewith.

TECHNICAL FIELD

[0002] The present invention relates generally to the field of multiprocessor computer systems and, more particularly, to coherence implementation in a multiprocessor system.

BACKGROUND

[0003] A shared memory processor (SMP) consists of processor nodes and memories combined into a scalable configuration. Each node has one or more processors and its local memory. Optionally, each node has a cache and a cache controller for accessing main memory efficiently and enforcing consistency. However, a shared memory SMP differs from a network of workstations because all nodes share the same global address space. Hence, software techniques for mapping the global address space into local addresses are typically not needed in a...
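As a back-of-the-envelope illustration of the shared global address space described in the background, the sketch below shows how a fixed mapping from a global address to a home node and a local cache line might look; NUM_NODES, MEM_PER_NODE, and LINE_SIZE are assumed parameters for illustration, not values from the patent.

```c
/* Illustrative sketch (assumed parameters): in a directory-based SMP each
 * global address has a "home" node whose local memory backs it and whose
 * directory tracks the corresponding cache line. */
#include <stdint.h>
#include <stdio.h>

#define NUM_NODES    4
#define MEM_PER_NODE (256u * 1024u * 1024u)   /* 256 MiB of local memory per node */
#define LINE_SIZE    128u                     /* cache-line (block) size in bytes */

/* Node whose local memory holds this global address. */
static unsigned home_node(uint64_t global_addr) {
    return (unsigned)((global_addr / MEM_PER_NODE) % NUM_NODES);
}

/* Index of the cache line within that node's local memory. */
static uint64_t line_index(uint64_t global_addr) {
    return (global_addr % MEM_PER_NODE) / LINE_SIZE;
}

int main(void) {
    uint64_t addr = 0x1234ABCDu;
    printf("address 0x%llx -> home node %u, line %llu\n",
           (unsigned long long)addr, home_node(addr),
           (unsigned long long)line_index(addr));
    return 0;
}
```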

Application Information

IPC(8): G06F12/00, G06F12/08
CPC: G06F12/0862, G06F12/0817
Inventors: GSCHWIND, MICHAEL KARL; JOHNS, CHARLES RAY; TRUONG, THOUNG QUANG
Owner: META PLATFORMS INC