
Data processing method and device based on hybrid memory

A hybrid-memory data processing technology, applied in the field of computing, which addresses problems such as low data processing efficiency and achieves the effect of improving data access efficiency and memory storage capacity.

Pending Publication Date: 2020-05-19
DAWNING INFORMATION IND BEIJING +1

AI Technical Summary

Problems solved by technology

[0004] The purpose of the embodiments of the present application is to provide a hybrid memory-based data processing method and device to solve the problem of low data processing efficiency in the prior art.




Embodiment Construction

[0029] The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.

[0030] Since single-node in-memory computing is limited by hardware resources, it faces scalability problems when processing larger-scale data. With the rapid development of large-scale distributed data processing technologies represented by MapReduce, in-memory computing has also begun to be implemented on distributed systems. This kind of in-memory computing uses a cluster of multiple computers to build a large distributed memory: through unified resource scheduling, the data to be processed is stored in the distributed memory, enabling fast access and processing of large-scale data.
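As a rough illustration of pooling the memory of many machines under a single scheduler, the sketch below greedily spreads data partitions across the free RAM of several cluster nodes. The node names, capacities, and greedy placement policy are illustrative assumptions standing in for "unified resource scheduling"; they are not taken from the patent.

# Minimal sketch (not from the patent): spreading data partitions across the
# RAM of several cluster nodes so a large dataset can be held and accessed
# entirely in memory.

class MemoryNode:
    def __init__(self, name, capacity_bytes):
        self.name = name
        self.capacity_bytes = capacity_bytes
        self.used_bytes = 0
        self.partitions = {}                 # partition_id -> in-memory bytes

    def free_bytes(self):
        return self.capacity_bytes - self.used_bytes

    def put(self, partition_id, data):
        self.partitions[partition_id] = data
        self.used_bytes += len(data)


def place_partitions(partitions, nodes):
    """Place each partition on the node that currently has the most free memory."""
    placement = {}
    for pid, data in partitions.items():
        target = max(nodes, key=lambda n: n.free_bytes())
        if target.free_bytes() < len(data):
            raise MemoryError("cluster memory exhausted while placing " + pid)
        target.put(pid, data)
        placement[pid] = target.name
    return placement


if __name__ == "__main__":
    cluster = [MemoryNode("node-1", 1 << 20), MemoryNode("node-2", 1 << 20)]
    parts = {"part-%d" % i: bytes(100_000) for i in range(8)}
    print(place_partitions(parts, cluster))   # partitions balanced across the two nodes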

[0031] Figure 1 is a schematic structural diagram of a distributed storage system provided by an embodiment of this application. As shown in Figure 1, the system includes ...



Abstract

The invention provides a data processing method and device based on a hybrid memory. The method is applied to a node in a distributed storage system. The distributed storage system comprises a plurality of nodes in communication connection with one another, where each node comprises an HFDD and an external memory, the HFDD comprises an internal memory and a solid state disk (SSD), and the internal memory comprises a random access memory (RAM) and an NVDIMM. The method comprises the following steps: calculating the popularity of each piece of data, where the popularity represents how frequently the corresponding data is accessed; and storing the data according to the popularity of the data and the storage capacities respectively corresponding to the internal memory, the SSD, and the external memory. In the embodiment of the invention, the HFDD is a fault-tolerant distributed data abstraction based on the RAM + NVM hybrid memory, and data is stored according to its popularity, so that on one hand the storage capacity of the memory is improved, and on the other hand the data access efficiency is improved.
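To make the popularity-based placement described in the abstract concrete, the following is a minimal sketch assuming a simple access-count popularity metric and three tiers ordered fastest first (internal memory, SSD, external memory). The tier names, capacity figures, and greedy fill policy are assumptions for illustration only; the patent does not publish this code.

# Minimal sketch: compute popularity from an access log and place the hottest
# data in the fastest storage tier that still has room.

from collections import Counter

def popularity(access_log):
    """Popularity of each data item = how often it appears in the access log."""
    return Counter(access_log)

def place_by_popularity(item_sizes, access_log, tier_capacities):
    """Assign the hottest items to the fastest tier with enough free capacity.

    item_sizes: {item_id: size_bytes}
    tier_capacities: ordered {tier_name: capacity_bytes}, fastest first,
                     e.g. {"internal_memory": ..., "ssd": ..., "external": ...}
    """
    heat = popularity(access_log)
    remaining = dict(tier_capacities)
    placement = {}
    # Hottest items first; cold items fall through to slower tiers.
    for item in sorted(item_sizes, key=lambda i: heat[i], reverse=True):
        size = item_sizes[item]
        for tier, free in remaining.items():
            if free >= size:
                placement[item] = tier
                remaining[tier] = free - size
                break
        else:
            raise MemoryError("no tier has room for " + item)
    return placement

if __name__ == "__main__":
    sizes = {"a": 400, "b": 300, "c": 500, "d": 200}
    log = ["a", "a", "a", "c", "c", "b", "d"]
    tiers = {"internal_memory": 700, "ssd": 600, "external": 10_000}
    print(place_by_popularity(sizes, log, tiers))  # hot "a" lands in internal memory

A real system would recompute popularity periodically and migrate data between tiers as access patterns change; the greedy one-shot assignment here is only meant to show the placement decision itself.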

Description

Technical Field

[0001] The present application relates to the field of computer technology, and in particular to a hybrid memory-based data processing method and device.

Background Technique

[0002] At present, big data computing technology can already process PB-level data. The concept of in-memory computing emerged for good reason: in the in-memory computing mode, all data is loaded into memory during the initialization phase, and data and query operations are performed in high-speed memory. The CPU reads data directly from memory and performs real-time calculation and analysis, which reduces disk data access, lessens the impact of network and disk I/O, greatly improves data throughput and processing speed, and eliminates the I/O overhead that originally takes up a lot of computing resources. Through the application of in-memory computing, I/O bottlenecks are avoided; results that previously took hours or days to calculate can be completed within ...


Application Information

IPC(8): G06F3/06
CPC: G06F3/067; G06F3/0688; G06F3/0607; G06F3/0623; G06F3/0614; Y02D10/00
Inventor: 郭庆, 谢莹莹, 于宏亮
Owner: DAWNING INFORMATION IND BEIJING