
Locality with parallel hierarchical copying garbage collection

A garbage collection and hierarchical copying technology, applied in the field of automatic memory management, that addresses the problem of computer programs wasting time stalled on memory accesses and achieves the effect of reducing cache and TLB misses.

Publication Date: 2008-01-03 (Inactive)
IBM CORP
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Benefits of technology

[0008]An object of this invention is to reduce cache and TLB misses by changing the order in which a parallel garbage collector copies heap objects.
[0011]Another object of the invention is to provide a garbage collection algorithm that both reduces cache and TLB misses through hierarchical copying and also maintains good scaling on multiprocessors.
[0013]The preferred embodiment of the invention, described in detail below, reduces cache and TLB misses and, in this way, improves program run time. Also, parallel garbage collection improves scaling on multi-processor machines.

Problems solved by technology

In operation, computer programs spend a lot of time stalled in cache and TLB misses, because computation tends to be faster than memory access.



Examples


Embodiment Construction

[0025]In accordance with the present invention, a garbage collection algorithm is provided that achieves hierarchical copy order with parallel garbage collection threads. FIGS. 2 and 3 illustrate, as an example, one suitable computer system in which the present invention may be used. This computer system 100, according to the present example, includes a controller / processor 102, which processes instructions, performs calculations, and manages the flow of information through the computer system 100. Additionally, the controller / processor 102 is communicatively coupled with program memory 104. Included within program memory 104 are a garbage collector 106, operating system platform 110, Java Programming Language 112, Java Virtual Machine 114, glue software 116, a memory allocator 202, Java application 204, a compiler 206, and a type profiler 208. It should be noted that while the present invention is demonstrated using the Java Programming Language, it would be obvious to those of ord...
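The paragraph above is truncated before the collector's details, so the following Java sketch illustrates only the parallel structure it implies: several garbage collection worker threads draining a shared pool of copying work, which is what lets the collector scale on multi-processor machines. The Runnable task type, the queue, and the thread count are illustrative assumptions rather than the patented mechanism.

import java.util.concurrent.ConcurrentLinkedQueue;

// Illustrative sketch only: the real collector runs inside the JVM and
// operates on heap blocks; the Runnable tasks and queue below are
// assumptions standing in for units of copying work.
public class ParallelGcWorkersSketch {

    public static void main(String[] args) throws InterruptedException {
        ConcurrentLinkedQueue<Runnable> workQueue = new ConcurrentLinkedQueue<>();

        // Seed the pool with some units of copying work (hypothetical).
        for (int i = 0; i < 8; i++) {
            final int task = i;
            workQueue.add(() ->
                System.out.println("copy task " + task + " handled by "
                        + Thread.currentThread().getName()));
        }

        // One GC worker per available processor, all draining the same pool.
        int gcThreads = Runtime.getRuntime().availableProcessors();
        Thread[] workers = new Thread[gcThreads];
        for (int t = 0; t < gcThreads; t++) {
            workers[t] = new Thread(() -> {
                Runnable work;
                while ((work = workQueue.poll()) != null) {
                    work.run();
                }
            }, "gc-worker-" + t);
            workers[t].start();
        }
        for (Thread w : workers) {
            w.join();
        }
    }
}

A real collector would hand out heap blocks or scan ranges rather than Runnable tasks and would stop its workers only after all threads agree the pool is empty; the sketch drains the queue once for simplicity.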



Abstract

Disclosed is a garbage collection algorithm that achieves hierarchical copy order with parallel garbage collection threads. More specifically, the present invention provides a garbage collection method and system for copying objects from a from-space to a to-space. The method comprises the steps of (a) having multiple threads that simultaneously perform work for garbage collection (GC), (b) examining the placement of objects on blocks, and (c) changing the placement of objects on blocks based on step (b). Preferably, the method includes the additional step of calculating a placement of object(s) based on step (b), and using the result of the calculation for step (c). For example, the calculation may be used to increase the frequency of intra-block pointers and / or to increase the frequency of siblings on the same block.
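To make steps (b) and (c) concrete, the following Java sketch (a simplified toy model written for this summary, not the implementation disclosed in the patent) copies a reachable object graph into a block-structured to-space while preferring to place each object's children on the same block as their parent for as long as that block has room. The HeapObject and ToSpace types, the block capacity, and the allocation policy are illustrative assumptions; the point is only that this placement choice increases intra-block pointers and keeps siblings on the same block.

import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Toy model of the heap: an object references children, and each copied
// object records which to-space block received its copy.
class HeapObject {
    List<HeapObject> children = new ArrayList<>();
    HeapObject copy;   // forwarding pointer set once the object is copied
    int block = -1;    // index of the to-space block holding the copy
}

// Toy to-space made of fixed-capacity blocks; allocation prefers a given
// block so related objects can be co-located.
class ToSpace {
    final int blockCapacity;
    final List<Integer> blockFill = new ArrayList<>();

    ToSpace(int blockCapacity) { this.blockCapacity = blockCapacity; }

    synchronized int allocate(int preferredBlock) {
        if (preferredBlock >= 0 && blockFill.get(preferredBlock) < blockCapacity) {
            blockFill.set(preferredBlock, blockFill.get(preferredBlock) + 1);
            return preferredBlock;
        }
        blockFill.add(1);               // open a fresh block
        return blockFill.size() - 1;
    }
}

public class HierarchicalCopySketch {

    // Copy everything reachable from root. Children are allocated on their
    // parent's block while it has room, so parent-to-child pointers become
    // intra-block and siblings tend to share a block (hierarchical order).
    static void copyHierarchically(HeapObject root, ToSpace toSpace) {
        Deque<HeapObject> pending = new ArrayDeque<>();
        root.copy = new HeapObject();
        root.block = toSpace.allocate(-1);
        pending.push(root);

        while (!pending.isEmpty()) {
            HeapObject parent = pending.pop();
            for (HeapObject child : parent.children) {
                if (child.copy == null) {
                    child.copy = new HeapObject();
                    child.block = toSpace.allocate(parent.block);  // co-locate with parent
                    pending.push(child);
                }
                parent.copy.children.add(child.copy);              // redirect the reference
            }
        }
    }

    public static void main(String[] args) {
        HeapObject root = new HeapObject();
        for (int i = 0; i < 6; i++) {
            root.children.add(new HeapObject());
        }
        ToSpace toSpace = new ToSpace(4);
        copyHierarchically(root, toSpace);
        for (HeapObject child : root.children) {
            System.out.println("child copied to block " + child.block
                    + " (root on block " + root.block + ")");
        }
    }
}

With a block capacity of 4, the example places the root and its first three children on block 0 and the remaining children on block 1, so most parent-to-child pointers stay within a single block.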

Description

BACKGROUND OF THE INVENTION

[0001]1. Field of the Invention

[0002]This invention generally relates to automatic memory management, and more specifically, the invention relates to methods and systems for copying garbage collection.

[0003]2. Background Art

[0004]In operation, computer programs spend a lot of time stalled in cache and TLB misses, because computation tends to be faster than memory access. For example, Adl-Tabatabai et al. report that the SPECjbb2000 benchmark spends 45% of its time stalled in misses on an Itanium processor [Ali-Reza Adl-Tabatabai, Richard L. Hudson, Mauricio J. Serrano, and Sreenivas Subramoney. Prefetch injection based on hardware monitoring and object metadata. In Programming Language Design and Implementation (PLDI), 2004]. Better locality reduces misses, and thus improves performance. For example, techniques like prefetching or cache-aware memory allocation improve locality, and can significantly speed up the performance of a program.

[0005]Locality is in...

Claims


Application Information

Patent Type & Authority Applications(United States)
IPC IPC(8): G06F12/00
CPCG06F12/0253
Inventor SIEGWART, DAVID K.HIRZEL, MARTIN J.
Owner IBM CORP