
Transparent sharing of memory pages using content comparison

A memory-page and content-comparison technology, applied in the field of memory management, that addresses the weaknesses of system memory, which, although usually fast, is volatile and limited in capacity.

Inactive Publication Date: 2009-11-17
VMWARE INC
3 Cites · 79 Cited by
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Benefits of technology

The invention provides a method and system for sharing memory units among software contexts in a computer system with hardware memory. The system identifies virtual memory units with identical contents and maps them to a single instance of a corresponding hardware memory unit. The contents of virtual memory units are hashed to detect identical content, and a data structure keeps track of previously examined units. The system can also defer the mapping of virtual memory units to improve efficiency. The invention can be implemented in a virtualized embodiment in which the virtual machine monitor includes an intermediate mapping layer that optimizes the use of hardware memory. The virtual memory units can also be partitioned into different classes for more efficient sharing of hardware memory units.
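The core idea above, hashing page contents and mapping pages with identical contents to a single backing frame, can be illustrated with a minimal sketch. This is not the patented implementation; the function name, the use of SHA-1, and the convention of using the first virtual page number seen as the shared frame id are all hypothetical simplifications.

```python
import hashlib

def share_identical_pages(pages):
    """Map virtual page numbers with identical contents to one shared frame.

    pages: {virtual_page_number: page_contents (bytes)}
    Returns {virtual_page_number: frame_id}, where pages with identical
    contents share a frame id (here, the first vpn seen with that content).
    """
    seen = {}     # content hash -> (frame_id, content)
    mapping = {}
    for vpn, content in pages.items():
        key = hashlib.sha1(content).digest()
        # A matching hash is only a hint; do a full content comparison
        # before actually sharing, since hashes can collide.
        if key in seen and seen[key][1] == content:
            mapping[vpn] = seen[key][0]   # point at the shared frame
        else:
            seen[key] = (vpn, content)
            mapping[vpn] = vpn            # backed by its own frame
    return mapping
```

In a real system the shared frame would additionally be marked copy-on-write so that a later write by any context triggers a private copy.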

Problems solved by technology

Although system memory is usually fast, it does have its weaknesses.
First, it is usually volatile.
Second, for a given amount of data to be stored, system memory takes up more physical space within the computer, is more expensive, and requires more support in terms of cooling, component sockets, etc., than does a conventional non-volatile storage device such as a disk.
Thus, whereas many gigabytes of disk storage are commonly included in even computers in the relatively unsophisticated consumer market, such computers seldom come with more than 128 or perhaps 256 megabytes of system RAM.
Indeed, many applications requiring real-time processing of complex calculations such as voice-recognition software, interactive graphics, etc., will not run properly at all unless a certain amount of RAM is reserved for their use while running.
High-speed system memory is a limited resource and, as with most limited resources, there is often competition for it.
This has become an even greater problem in modern multi-tasked systems, in which several applications may be running, or at least resident in memory, at the same time.
Second, the pages are actually shared, when possible, so that redundant copies can be reclaimed.
One disadvantage of the page-sharing approach described in Bugnion '938 is that the guest OS must be modified to include the necessary hooks.
This limits the use of the Bugnion '938 solution not only to systems where such modifications are possible but also to those users who are willing and knowledgeable enough to perform or at least accept the modifications.
Note that such attempted modifications to commodity operating systems may not be possible for those other than the manufacturer of the operating system itself, and then not without greatly increasing the probability that the modifications will lead to “bugs” or instability elsewhere.
Another disadvantage of the Bugnion '938 system is that it will often fail to identify pages that can be shared by different VMs.

Method used


Image

  • Transparent sharing of memory pages using content comparison

Examples

Experimental program
Comparison scheme
Effect test

Performance example

[0166]One working prototype of the invention exported an interface for configuring page sharing parameters and querying status information. The output given below is a snapshot taken when running the invention on a private build of VMware ESX Server running on a dual-processor x86 platform with support for page sharing and with the speculative hint frame optimization procedure described above.

[0167]In the test from which this snapshot was taken, three VMs were each running from a non-persistent virtual disk and using Windows 2000 as the VOS. Two of the VMs were configured for 128 MB memory, the remaining VM being configured for 64 MB. Each VM ran a simple workload consisting of the Microsoft Internet Explorer web browser, Windows Task Manager, and a command prompt window. Each VMM used a randomized policy to select candidate pages for sharing, and scanned pages at a maximum rate of 50 pages per second.

[0168]The overhead required to use the invention in this test was as follows:

[0169...



Abstract

A computer system has one or more software contexts that share use of a memory that is divided into units such as pages. In the preferred embodiment of the invention, the contexts are, or include, virtual machines running on a common hardware platform. The contents, as opposed to merely the addresses or page numbers, of virtual memory pages that are accessible to one or more contexts are examined. If two or more context pages are identical, then their memory mappings are changed to point to a single, shared copy of the page in the hardware memory, thereby freeing the memory space taken up by the redundant copies. The shared copy is then preferably marked copy-on-write. Sharing is preferably dynamic, whereby the presence of redundant copies of pages is preferably determined by hashing page contents and performing full content comparisons only when two or more pages hash to the same key.
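The abstract's copy-on-write behavior, where a write to a shared page first gives the writing context its own private copy, can be sketched as follows. The function and variable names are hypothetical; frames are modeled as a simple dict of byte buffers rather than real machine memory.

```python
def write_to_page(mapping, frames, vpn, offset, byte):
    """Copy-on-write sketch.

    mapping: {virtual_page_number: frame_id}
    frames:  {frame_id: bytearray holding the frame contents}
    If the frame backing vpn is shared with another virtual page,
    copy it to a fresh private frame before performing the write.
    """
    fid = mapping[vpn]
    shared = sum(1 for f in mapping.values() if f == fid) > 1
    if shared:
        new_fid = max(frames) + 1                  # allocate a fresh frame id
        frames[new_fid] = bytearray(frames[fid])   # private copy of the page
        mapping[vpn] = new_fid                     # remap the writer only
        fid = new_fid
    frames[fid][offset] = byte                     # the write lands privately
```

After the copy, the other contexts still see the original shared frame unchanged, which is what makes the sharing transparent to them.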

Description

CROSS-REFERENCE TO RELATED APPLICATIONS[0001]This application claims priority of U.S. patent application Ser. No. 09 / 915,045, filed 25 Jul. 2001 now U.S. Pat. No. 6,789,156, which in turn claims priority of U.S. Provisional Patent Application No. 60 / 293,325, filed 22 May 2001.BACKGROUND OF THE INVENTION[0002]1. Field of the Invention[0003]This invention relates to the field of memory management in computer systems.[0004]2. Description of the Related Art[0005]Most modern computers include at least one form of data storage that has programmable address translation or mapping. In most computers, this storage will be provided by a relatively high-speed system memory, which is usually implemented using solid-state random-access memory (RAM) components.[0006]Although system memory is usually fast, it does have its weaknesses. First, it is usually volatile. Second, for a given amount of data to be stored, system memory takes up more physical space within the computer, is more expensive, an...

Claims


Application Information

Patent Type & Authority: Patent (United States)
IPC(8): G06F12/02; G06F12/10
CPC: G06F12/1018; G06F12/109; G06F2212/151; G06F12/1036
Inventor WALDSPURGER, CARL A.
Owner VMWARE INC