
Memory allocation method based on fine granularity

A fine-grained memory allocation technology, applied to program control design, instrumentation, and electrical digital data processing, which can solve problems such as memory expansion, increased memory allocation overhead, and a reduced memory deduplication rate.

Active Publication Date: 2018-11-30
UNIV OF SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

While a virtual machine is running, the memory management system does not consider differences in page access characteristics and uniformly allocates large pages to the virtual machine. Although this allocation mechanism brings performance advantages to the virtual machine, it also introduces problems into the system such as memory expansion, increased memory allocation overhead, and a reduced memory deduplication rate.



Examples


Embodiment 1

[0022] Figure 1 is a schematic diagram of the operation flow of the fine-grained memory allocation method of the present invention; Figure 2 is a schematic diagram of the virtual machine's memory usage under the system's default configuration.

[0023] This embodiment implements a fine-grained memory allocation method, which specifically includes the following steps:

[0024] The first step: detection of the virtual machine type

[0025] For all running virtual machines, the memory bandwidth of each virtual machine is obtained through hardware performance counters (see operation box ① in Figure 1), and the I/O access frequency of each virtual machine is obtained by intercepting the virtual machine's input/output (I/O) access path (see operation box ② in Figure 1). Figure 3 gives a schematic diagram of the virtual machine's I/O access path: when a virtual machine initiates I/O requests, these I/O requests wi...
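As a rough illustration of this detection step, the minimal C sketch below classifies a virtual machine from the two measurements named above: memory bandwidth and I/O access frequency. It is not code from the patent; the threshold values, category names, and function names are hypothetical placeholders, since the patent does not disclose concrete figures.

    #include <stdio.h>

    /* Hypothetical virtual machine categories used by the detection step. */
    enum vm_type { VM_IO_INTENSIVE, VM_MEMORY_INTENSIVE, VM_COMPUTE_INTENSIVE };

    /*
     * Classify a running virtual machine from the two quantities the
     * embodiment measures: memory bandwidth (read from hardware
     * performance counters) and I/O access frequency (obtained by
     * intercepting the VM's I/O path).  The thresholds below are
     * illustrative placeholders only.
     */
    static enum vm_type classify_vm(double mem_bandwidth_mb_s, double io_requests_per_s)
    {
        const double IO_THRESHOLD = 1000.0;   /* hypothetical IOPS threshold */
        const double BW_THRESHOLD = 4096.0;   /* hypothetical MB/s threshold */

        if (io_requests_per_s >= IO_THRESHOLD)
            return VM_IO_INTENSIVE;           /* dominated by I/O traffic    */
        if (mem_bandwidth_mb_s >= BW_THRESHOLD)
            return VM_MEMORY_INTENSIVE;       /* dominated by memory traffic */
        return VM_COMPUTE_INTENSIVE;          /* neither I/O nor memory bound */
    }

    int main(void)
    {
        /* Example measurements for one VM over one monitoring window. */
        double bandwidth_mb_s = 6200.0, iops = 150.0;
        printf("VM class: %d\n", (int)classify_vm(bandwidth_mb_s, iops));
        return 0;
    }

In practice the measurements themselves would come from hardware performance counters and from the hypervisor's I/O interception point, which are outside the scope of this sketch.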



Abstract

The invention discloses a memory allocation method based on fine granularity. The method is characterized by detection of the virtual machine type, detection of the page types inside a virtual machine, a fine-grained differentiated page allocation strategy, and an access-aware dynamic memory allocation strategy. Because virtual machine types are distinguished, the pages used by I/O-intensive and compute-intensive virtual machines are all small pages; compared with the system's default strategy of allocating large pages to virtual machines, this relieves memory expansion, reduces memory allocation overhead, and increases the memory deduplication rate. Meanwhile, for a memory-access-intensive virtual machine, the anonymous pages allocated to it are large pages, so high memory access performance is maintained, while its Page Cache pages and kernel pages are allocated as small pages; compared with the default large-page strategy, this again relieves memory expansion, reduces memory allocation overhead, increases the memory deduplication rate, and minimizes the loss of system performance.
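The differentiated strategy summarized above can be read as a small decision table: I/O-intensive and compute-intensive virtual machines receive small pages for everything, while a memory-access-intensive virtual machine receives large pages only for its anonymous memory. The C sketch below is an illustrative reconstruction of that table, not code from the patent; the type names, constants, and function are assumptions introduced here.

    #include <stdio.h>
    #include <stddef.h>

    /* Page classes and VM types distinguished by the abstract's strategy. */
    enum page_class { PAGE_ANONYMOUS, PAGE_CACHE, PAGE_KERNEL };
    enum vm_type    { VM_IO_INTENSIVE, VM_MEMORY_INTENSIVE, VM_COMPUTE_INTENSIVE };

    #define SMALL_PAGE_SIZE (4UL * 1024)          /* 4KB ordinary page         */
    #define LARGE_PAGE_SIZE (2UL * 1024 * 1024)   /* 2MB transparent huge page */

    /*
     * Decision table: only the anonymous pages of a memory-access-intensive
     * VM are backed by large pages; every other combination uses small pages.
     */
    static size_t choose_page_size(enum vm_type vm, enum page_class pc)
    {
        if (vm == VM_MEMORY_INTENSIVE && pc == PAGE_ANONYMOUS)
            return LARGE_PAGE_SIZE;
        return SMALL_PAGE_SIZE;
    }

    int main(void)
    {
        printf("%zu\n", choose_page_size(VM_MEMORY_INTENSIVE, PAGE_ANONYMOUS)); /* 2097152 */
        printf("%zu\n", choose_page_size(VM_IO_INTENSIVE, PAGE_ANONYMOUS));     /* 4096    */
        return 0;
    }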

Description

Technical field

[0001] The invention belongs to the technical field of computer memory management, and specifically relates to a performance-efficient fine-grained memory allocation method implemented through detection mechanisms for the virtual machine type and the page type.

Background technique

[0002] Linux version 2.6.38 introduced the kernel feature of transparent huge pages (Transparent Hugepages), which achieves better performance by improving the efficiency of the processor's memory-mapping hardware. In the Linux memory management system, ordinary pages are 4KB in size and are called small pages, while transparent huge pages are 2MB in size and are called large pages. Although large pages offer higher memory access performance than small pages, they bring problems to the system such as memory expansion, high allocation overhead, and a low memory deduplication rate. During the running of a virtual machine, different virtual machines h...
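For context on the kernel feature described above, the minimal C sketch below shows the standard Linux mechanism by which a process can opt an anonymous memory region in or out of transparent huge pages: madvise() with MADV_HUGEPAGE or MADV_NOHUGEPAGE. This only illustrates the underlying kernel interface; the patent does not state that its allocator is implemented through these particular calls.

    #define _GNU_SOURCE
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void)
    {
        size_t len = 64UL * 1024 * 1024;   /* a 64MB anonymous region */

        void *region = mmap(NULL, len, PROT_READ | PROT_WRITE,
                            MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (region == MAP_FAILED) { perror("mmap"); return 1; }

        /* Ask the kernel to back this region with 2MB transparent huge pages. */
        if (madvise(region, len, MADV_HUGEPAGE) != 0)
            perror("madvise(MADV_HUGEPAGE)");   /* kernel may lack THP support */

        /* Touch the memory so pages are actually faulted in. */
        memset(region, 0, len);

        /* A region that should stay on 4KB pages can opt out instead:
           madvise(other_region, other_len, MADV_NOHUGEPAGE); */

        munmap(region, len);
        return 0;
    }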

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/455
CPC: G06F9/45558; G06F2009/45579; G06F2009/45583
Inventors: 许胤龙, 刘军明, 李永坤, 郭帆, 李诚, 吕敏, 陈吉强
Owner: UNIV OF SCI & TECH OF CHINA