
Method and device for managing memory resources in cluster system, and network system

A cluster-system memory-resource technology, applied in transmission systems, storage systems, data exchange networks, etc. It addresses the problems that idle memory resources are left unused, that the utilization rate of memory resources in the cluster system is low, and that a single memory server can hardly meet large-capacity memory requirements, thereby meeting the need for large-capacity memory and improving utilization.

Active Publication Date: 2009-12-02
HUAWEI TECH CO LTD +1

AI Technical Summary

Problems solved by technology

[0005] In the prior art, a single memory server maintains a free memory pool as a remote buffer for every physical host. As the number of physical hosts increases, the capacity of the free memory pool maintained by the memory server grows, and a single memory server can hardly meet the resulting large-capacity memory requirements. Moreover, the idle memory resources of each physical host are not utilized, so the utilization rate of memory resources in the cluster system is low.

Method used



Examples


Embodiment 1

[0039] Referring to Figure 1a, Embodiment 1 of the present invention provides a method for managing memory resources in a cluster system. The method comprises:

[0040] 101a. The requester sends a remote cache request message to the arbitration server, where the remote cache request message includes the remote cache capacity required by the local virtual machine.

[0041] 102a. The requester receives the provider's information sent by the arbitration server, where the provider's information is acquired by the arbitration server according to the free memory information reported by the physical hosts in the cluster system and the remote cache capacity required by the local virtual machine.

[0042] 103a. The requester uses the free memory of the provider according to the information of the provider.

[0043] Wherein, the requester and the provider are different physical hosts in the cluster system.

[0044] In Embodiment 1 of the present invention, the requester receives the informat...
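As a rough illustration of steps 101a to 103a, the sketch below shows a requester asking the arbitration server for a provider and then pushing a page into that provider's free memory. The JSON-over-TCP message format, the field names, and the server address are assumptions made for illustration; the patent does not prescribe a wire protocol.

```python
import json
import socket

# Hypothetical address of the arbitration server.
ARBITRATION_SERVER = ("arbitration.example", 7000)

def request_remote_cache(required_mb):
    """101a/102a: ask the arbitration server for a provider of remote cache."""
    with socket.create_connection(ARBITRATION_SERVER) as sock:
        # 101a: the request carries the remote cache capacity the local VM needs.
        request = {"type": "remote_cache_request", "required_mb": required_mb}
        sock.sendall(json.dumps(request).encode() + b"\n")
        # 102a: the arbitration server replies with the chosen provider's
        # information (assumed here to be its address and the granted capacity).
        reply = sock.makefile("r").readline()
    return json.loads(reply)

def store_page_on_provider(provider, page_id, page):
    """103a: use the provider's free memory by pushing a page of the local VM."""
    with socket.create_connection((provider["host"], provider["port"])) as sock:
        header = {"type": "store_page", "page_id": page_id, "size": len(page)}
        sock.sendall(json.dumps(header).encode() + b"\n" + page)

if __name__ == "__main__":
    provider_info = request_remote_cache(required_mb=512)
    store_page_on_provider(provider_info, page_id=0, page=b"\x00" * 4096)
```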

Embodiment 2

[0046] Referring to Figure 1b, Embodiment 2 of the present invention provides a method for managing memory resources in a cluster system. The method comprises:

[0047] 101b. Receive free memory information reported by physical hosts in the cluster system;

[0048] 102b. Receive a message requesting remote cache sent by the requester, where the message requesting remote cache carries the remote cache capacity required by the virtual machine of the requester;

[0049] 103b. Determine the provider according to the remote cache capacity required by the requester's virtual machine and the free memory information reported by the physical hosts in the cluster system, and send the provider's information to the requester, so that the requester utilizes the provider's free memory, wherein the requester and the provider are different physical hosts in the cluster system.

[0050] The method is executed by the arbitration server, and the arbitration serve...
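A minimal sketch of the arbitration-server side described in steps 101b to 103b follows. The in-memory table, the "most free memory first" selection policy, and the capacity reservation are assumptions; the patent only requires that the provider be a different physical host chosen from the reported free-memory information and the requested capacity.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class HostInfo:
    host_id: str
    free_mb: int   # free memory most recently reported by this physical host

class ArbitrationServer:
    """Keeps reported free memory (101b) and picks a provider per request (102b/103b)."""

    def __init__(self):
        self.free_memory: Dict[str, HostInfo] = {}

    def report_free_memory(self, host_id, free_mb):
        # 101b: record the free memory information reported by a physical host.
        self.free_memory[host_id] = HostInfo(host_id, free_mb)

    def handle_cache_request(self, requester_id, required_mb) -> Optional[HostInfo]:
        # 102b: the request carries the capacity the requester's VM needs.
        # 103b: choose a provider (a different physical host) with enough free
        # memory, and return its information to the requester.
        candidates = [h for h in self.free_memory.values()
                      if h.host_id != requester_id and h.free_mb >= required_mb]
        if not candidates:
            return None
        provider = max(candidates, key=lambda h: h.free_mb)  # assumed policy
        provider.free_mb -= required_mb                      # reserve the capacity
        return provider

# Example usage
server = ArbitrationServer()
server.report_free_memory("hostA", 2048)
server.report_free_memory("hostB", 512)
print(server.handle_cache_request("hostB", 1024))  # provider: hostA
```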

Embodiment 3

[0054] Referring to Figure 2, Embodiment 3 of the present invention provides a method for managing memory resources in a cluster system. The method uses a fast switching network to migrate the requester's page content to the provider's free memory for storage. The method includes:

[0055] 201. Each physical host in the cluster system predicts the size of the free memory it can provide according to the running state of its own virtual machines, and periodically sends a heartbeat message to the arbitration server, where the heartbeat message carries the free memory size. The arbitration server may use a data structure or data table to save the free memory size of each physical host and periodically update it.

[0056] In this step, the physical hosts in the cluster system can use the following methods to predict the size of free memory they can provide:

[0057] The guest operating system Guest OS on the physical host ...
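The heartbeat mechanism of step 201 could be sketched as follows: each physical host periodically reports its predicted free memory, and the arbitration server keeps the latest value per host in a table and ignores entries that stop refreshing. The reporting interval, the staleness threshold, and the direct function call in place of a network message are assumptions for illustration.

```python
import threading
import time

HEARTBEAT_INTERVAL_S = 5   # assumed reporting period
STALE_AFTER_S = 15         # assumed expiry for hosts that stop reporting

class FreeMemoryTable:
    """Arbitration-server-side table of each host's last reported free memory."""

    def __init__(self):
        self._entries = {}            # host_id -> (free_mb, last_heartbeat_time)
        self._lock = threading.Lock()

    def update(self, host_id, free_mb):
        # Step 201: a heartbeat carrying the predicted free memory size
        # refreshes this host's entry.
        with self._lock:
            self._entries[host_id] = (free_mb, time.time())

    def snapshot(self):
        # Only hosts whose heartbeat is still fresh are considered as providers.
        now = time.time()
        with self._lock:
            return {h: mb for h, (mb, ts) in self._entries.items()
                    if now - ts < STALE_AFTER_S}

def heartbeat_loop(host_id, table, predict_free_mb):
    """Physical-host side: predict free memory and report it periodically.

    Network transport is omitted; in this sketch the host writes into the
    arbitration server's table directly."""
    while True:
        table.update(host_id, predict_free_mb())
        time.sleep(HEARTBEAT_INTERVAL_S)
```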



Abstract

The embodiments of the invention provide a method and a device for managing memory resources in a cluster system. The method comprises: a requester sends a message requesting a remote cache to an arbitration server, where the message carries the remote cache capacity required by a local virtual machine; the requester receives information of a provider sent by the arbitration server, where the information of the provider is acquired by the arbitration server according to the free memory information reported by the physical hosts in the cluster system and the remote cache capacity required by the local virtual machine; and the requester utilizes the free memory of the provider according to the information of the provider, where the requester and the provider are different physical hosts in the cluster system. With the technical solution provided by the embodiments of the invention, the utilization rate of memory resources in the cluster system can be improved.

Description

Technical field

[0001] The invention relates to the field of communication technology, and in particular to a method, a device, and a network system for managing memory resources in a cluster system.

Background technique

[0002] Multiple virtual machines (Virtual Machine, VM) can be virtualized on a single physical host, and multiple independent guest operating systems can run on them simultaneously; the virtual machines access actual physical resources through a virtual machine monitor (Virtual Machine Monitor, VMM).

[0003] The prior art provides a dedicated memory server as a buffer between the physical hosts' memory and their local disks. The memory server maintains a free memory pool as the remote buffer of each physical host, so that the operating system and application software on the virtual machines of each physical host can transparently use the memory resources on the memory server across physical hosts over a high-speed network.

[0004] The disadvantages of the prior art are:

[0005] In the prior art,...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04L12/56, H04L29/06, G06F12/08, G06F12/0806, G06F12/0866
CPC: G06F12/0806, G06F12/0866
Inventor: 全小飞, 罗英伟
Owner: HUAWEI TECH CO LTD