
A memory pool and memory allocation method

A memory pool and memory allocation technology, applied in the field of data processing, that addresses problems such as CPU cache invalidation and reduced effective utilization of the CPU Cache, and achieves the effect of improving the cache hit rate.

Active Publication Date: 2018-08-03
NEUSOFT CORP

AI Technical Summary

Problems solved by technology

If the capacity of the memory pool is large, the system will frequently access a large range of memory. This not only causes CPU cache misses when the system accesses memory objects, but also occupies a large amount of CPU cache resources, so that valid data is swapped out, reducing the effective utilization of the CPU Cache.




Embodiment Construction

[0024] The technical solutions in the embodiments of this application are described below clearly and completely with reference to the accompanying drawings. Evidently, the described embodiments are only some, not all, of the embodiments of this application. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of this application without creative effort fall within the protection scope of this application.

[0025] When an existing memory pool performs high-throughput network data forwarding, a large number of memory object allocation and release operations are performed. If the capacity of the memory pool is large, the system will frequently access a large range of memory, which not only causes CPU cache misses when the system accesses memory objects, but also occupies a large amount of CPU cache resources, so that valid data is swapped out, reducing the effective utilization of the CPU Cache.



Abstract

The invention discloses a memory pool. The memory pool includes at least two levels of queues, which correspond to the memory objects in the pool and are used to allocate and release those objects. The at least two levels of queues include a first-level queue; when an external module applies to the memory pool for a memory object, the first-level queue is used preferentially to allocate the object. The memory pool provided by the present invention conforms to the principle of access locality: even during high-throughput network data forwarding, the system does not frequently access a wide range of memory, the CPU cache is not invalidated when the system accesses memory objects, and valid data in the CPU Cache is not swapped out, which greatly improves the CPU Cache hit rate.
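The two-level idea in the abstract can be sketched as follows. This is an illustrative sketch, not the patent's actual implementation: all names and sizes (`L1_CAP`, `L2_CAP`, `two_level_pool`) are assumptions, and the small first level is implemented here as a LIFO stack (which maximizes reuse of recently freed, cache-warm objects), whereas the patent speaks of queues. The key property shown is that allocation prefers a small first-level structure and only falls back to the larger second level when the first is empty.

```c
#include <stddef.h>

/* Hypothetical sketch of a two-level memory pool (names are
 * illustrative, not from the patent). A small "hot" first level
 * is tried first on allocation and refilled on release, so the
 * set of recently touched objects stays small and cache-friendly;
 * the larger second level is the bulk backing store. */

#define L1_CAP 4    /* small hot first level */
#define L2_CAP 64   /* bulk second level */

typedef struct {
    void *l1[L1_CAP]; size_t l1_len;  /* LIFO for locality (sketch choice) */
    void *l2[L2_CAP]; size_t l2_len;
} two_level_pool;

/* Allocate a pointer: prefer the first level, fall back to the second. */
static void *pool_alloc(two_level_pool *p) {
    if (p->l1_len > 0)
        return p->l1[--p->l1_len];
    if (p->l2_len > 0)
        return p->l2[--p->l2_len];
    return NULL;  /* pool exhausted */
}

/* Release a pointer: return it to the hot first level if there is room,
 * otherwise overflow into the second level. */
static void pool_free(two_level_pool *p, void *obj) {
    if (p->l1_len < L1_CAP)
        p->l1[p->l1_len++] = obj;
    else if (p->l2_len < L2_CAP)
        p->l2[p->l2_len++] = obj;
    /* else: pool over-full; dropped in this sketch */
}
```

Because a just-released object lands in the first level and is handed out again on the very next allocation, the working set of pointers the forwarding loop touches stays within a few cache lines, which is the locality effect the abstract claims.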

Description

Technical field

[0001] The invention relates to the field of data processing, in particular to a memory pool and a memory allocation method.

Background technique

[0002] Currently, memory objects in a traditional memory pool are allocated and released through a single queue. When the memory pool is initialized, a pointer to each memory object is stored in this queue. When an external module applies to the memory pool for a memory object, a pointer is dequeued and returned to the external module, which uses it to access the memory object. When an external module releases a memory object back to the memory pool, the queue enqueues the pointer to that object.

[0003] Since memory objects are allocated and released according to the queue's first-in-first-out principle, a large number of allocation and release operations are performed during high-throughput network data forwarding. If the capacity of the memory pool is large, the system will frequently access a large range of memory, which not only causes CPU cache misses when the system accesses memory objects, but also occupies a large amount of CPU cache resources, swapping out valid data and reducing the effective utilization of the CPU Cache.
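The single-queue scheme described in [0002] can be sketched roughly as below. All names (`fifo_pool`, `fifo_alloc`, `fifo_release`) are illustrative assumptions, not the patent's terminology. The sketch makes the problem in [0003] concrete: because the queue is FIFO, a just-released pointer goes to the back and is reused only after every other pointer has cycled through, so with a large pool the forwarding loop sweeps across the whole range of objects and defeats cache locality.

```c
#include <stddef.h>

/* Minimal sketch of a traditional single-queue memory pool
 * (illustrative, not from the patent): one FIFO ring of
 * pointers to preallocated memory objects. */

#define POOL_CAP 64

typedef struct {
    void *ring[POOL_CAP];
    size_t head, tail, len;  /* FIFO ring-buffer indices */
} fifo_pool;

/* Dequeue a pointer for the caller, or NULL if the pool is empty. */
static void *fifo_alloc(fifo_pool *p) {
    if (p->len == 0) return NULL;
    void *obj = p->ring[p->head];
    p->head = (p->head + 1) % POOL_CAP;
    p->len--;
    return obj;
}

/* Enqueue a released pointer at the back of the queue. */
static void fifo_release(fifo_pool *p, void *obj) {
    if (p->len == POOL_CAP) return;  /* full: dropped in this sketch */
    p->ring[p->tail] = obj;
    p->tail = (p->tail + 1) % POOL_CAP;
    p->len++;
}
```

Note the FIFO behavior: an object released now is the last of all pooled objects to be allocated again, by which time its cache lines have long been evicted.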

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F9/50, G06F12/0871
Inventor: 金健
Owner: NEUSOFT CORP