
Context switch data prefetching in multithreaded computer

A multithreaded computer and context switch technology, applied in multi-programming arrangements, memory addressing/allocation/relocation, instruments, etc., which solves the problems of an increased occurrence of cache misses once a thread is switched back in, degrading overall system performance; it achieves the effects of reducing the cache-related performance penalty of context switching and increasing overall system performance.

Inactive Publication Date: 2008-08-21
Assignee: INT BUSINESS MASCH CORP
Cites: 21, Cited by: 3

AI Technical Summary

Benefits of technology

[0018]The invention addresses these and other problems associated with the prior art by initiating, in connection with a context switch operation, a prefetch of data likely to be used by a thread prior to resuming execution of that thread. Put another way, once it is known that a context switch will be performed to a particular thread, embodiments consistent with the invention initiate prefetching of data on behalf of that thread so that when execution of the thread is resumed, more of the working state for the thread is likely to be cached, or at least in the process of being retrieved into cache memory. As a result, in many instances the cache-related performance penalties associated with context switching can be reduced, and thus overall system performance can be increased.
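
For illustration only, a minimal C sketch of what such a prefetch hook might look like is given below; it is not taken from the patent. The struct thread layout, the WORKING_SET_LINES sample size, and the use of GCC's __builtin_prefetch as a portable stand-in for a hardware touch/prefetch instruction are all assumptions made for the example.

    #include <stddef.h>

    #define WORKING_SET_LINES 32  /* hypothetical per-thread sample size */

    struct thread {
        void  *working_set[WORKING_SET_LINES]; /* line addresses sampled at switch-out */
        size_t ws_count;                       /* number of valid entries */
        /* ... saved registers, stack pointer, and other context ... */
    };

    /* Called after the scheduler has chosen `next` but before its context is
     * restored: begin pulling its working set back into the cache so the
     * memory latency overlaps with the remaining context switch overhead. */
    static void prefetch_on_context_switch(const struct thread *next)
    {
        for (size_t i = 0; i < next->ws_count; i++)
            __builtin_prefetch(next->working_set[i], 0 /* read */, 2 /* locality */);
    }

Under these assumptions, the working-set addresses would be recorded as the thread is switched out, so that non-blocking prefetches for them can be issued the moment the scheduler selects that thread to run again.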

Problems solved by technology

While multithreading of this nature can significantly increase system performance, some inefficiencies exist as a result of switching between the execution of different threads in a given execution path.
One undesirable side effect of performing a context switch in many environments, however, is the increased occurrence of cache misses once a thread is switched back in.
Often, a computer relies on a relatively large, slow and inexpensive mass storage system such as a hard disk drive or other external storage device, an intermediate main storage memory that uses dynamic random access memory devices (DRAM's) or other volatile memory storage devices, and one or more high speed, limited capacity cache memories, or caches, implemented with static random access memory devices (SRAM's) or the like.
Whenever a memory access request attempts to access a memory address that is not cached in a cache memory, a “cache miss” occurs.
As a result of a cache miss, the cache line for a memory address typically must be retrieved from a relatively slow, lower level memory, often with a significant performance hit.
However, given that the same premise applies to all of the threads executing in a multithreaded computer, whenever a thread is suspended as a result of a context switch, and then is later resumed as a result of another context switch, it is likely that some or all of the instructions and data that were cached prior to suspending the thread will no longer be cached when the thread is resumed (principally due to the caching of instructions and data needed by other threads that were executed in the interim).
A greater number of cache misses then typically occur, thus negatively impacting overall system performance.
Prefetching and branch prediction, which rely on historical data, also typically provide little or no benefit for a resumed thread upon its initial resumption of execution, as the prefetching of instructions and data cannot be initiated until after the thread resumes its execution.



Embodiment Construction

[0032]The embodiments discussed hereinafter utilize context switch prefetching to prefetch data likely to be used by a thread prior to resumption of execution of the thread. In this context, data that is likely to be used by a thread may be considered to include both the instructions that are executed by a thread, as well as the data that is processed by those instructions as a result of their execution.

[0033]As will become more apparent below, context switch prefetching may be used to prefetch data for a thread in connection with a context switch to that thread, or in the alternative, in connection with a context switch to another thread (e.g., when the thread for which the data is prefetched will be resumed upon the next context switch). Moreover, the prefetching may be software- or hardware-based, and may be performed for instructions, data to be processed by instructions, or both. Various methods of initiating a prefetch, including issuing a touch instruction, programming a hard...
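
As a rough sketch of the touch-instruction variant mentioned above, and of the alternative of prefetching on behalf of the thread expected to resume on the next context switch, the following hedged C fragment uses the PowerPC dcbt (data cache block touch) instruction with a generic fallback. The helper names are hypothetical, struct thread refers to the illustrative structure in the earlier sketch, and none of this code comes from the patent.

    /* Hypothetical touch helper: on PowerPC, dcbt hints the cache hierarchy
     * to fetch the line containing addr without stalling execution; elsewhere
     * fall back to the generic compiler builtin. */
    static inline void touch_line(const void *addr)
    {
    #if defined(__powerpc__) || defined(__powerpc64__)
        __asm__ volatile ("dcbt 0,%0" : : "r" (addr) : "memory");
    #else
        __builtin_prefetch(addr, 0, 3);
    #endif
    }

    /* The alternative described above: while switching to one thread, also
     * start touching lines for `after_next`, the thread expected to resume
     * on the following context switch. */
    static void prefetch_for_thread_after_next(const struct thread *after_next)
    {
        for (size_t i = 0; i < after_next->ws_count; i++)
            touch_line(after_next->working_set[i]);
    }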



Abstract

An apparatus, program product and method initiate, in connection with a context switch operation, a prefetch of data likely to be used by a thread prior to resuming execution of that thread. As a result, once it is known that a context switch will be performed to a particular thread, data may be prefetched on behalf of that thread so that when execution of the thread is resumed, more of the working state for the thread is likely to be cached, or at least in the process of being retrieved into cache memory, thus reducing cache-related performance penalties associated with context switching.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001]This application is a divisional of U.S. patent application Ser. No. 10/739,738, filed on Dec. 18, 2003 by Jeffrey P. Bradford et al. In addition, this application is related to U.S. patent application Ser. No. 10/739,739, filed Dec. 18, 2003 by Jeffrey P. Bradford et al. and entitled “CONTEXT SWITCH INSTRUCTION PREFETCHING IN MULTITHREADED COMPUTER” (ROC920030286US1), and to U.S. patent application Ser. No. ______, filed on even date herewith by Jeffrey P. Bradford et al. (ROC920030285US2), which is also a divisional of the aforementioned '738 application. The entire disclosures of each of these applications are incorporated by reference herein.

FIELD OF THE INVENTION

[0002]The invention relates to computers and computer software, and in particular to prefetching of instructions and data in a multithreaded computer system.

BACKGROUND OF THE INVENTION

[0003]Given the continually increased reliance on computers in contemporary society, computer tec...


Application Information

Patent Type & Authority: Applications (United States)
IPC(8): G06F9/30, G06F9/38, G06F9/46, G06F12/08
CPC: G06F9/463, G06F2212/6028, G06F12/0862
Inventors: BRADFORD, JEFFREY POWERS; KOSSMAN, HAROLD F.; MULLINS, TIMOTHY JOHN
Owner: INT BUSINESS MASCH CORP