
System for settling fierce competition of memory resources in big data processing system

A big data processing and memory-resource technology, applied in the field of I/O performance optimization within computer system architecture. It addresses the problems of fierce and excessive competition for memory resources and the inability to dynamically adjust the ratio of CPU to memory resources, achieving the effects of reduced spill-to-disk I/O operations, strong versatility and portability, and efficient memory use.

Active Publication Date: 2016-08-17
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0006] In view of the defects of the prior art, the object of the present invention is to provide a system for settling the fierce competition for memory resources in a big data processing system, aiming to solve the technical problems of excessive competition for memory resources, unnecessary spill-to-disk I/O operations, and the inability to dynamically adjust the ratio of CPU to memory resources.



Examples


Embodiment Construction

[0027] To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it.

[0028] As shown in Figure 1, the present invention provides a system for settling the fierce competition for memory resources in a big data processing system, comprising a memory information feedback module, an information sampling and analysis module, and a decision-making and task distribution module.

[0029] The memory information feedback module monitors the memory usage of running thread tasks, counting the amount of memory consumed during each task's execution as well as the amount of data spilled from memory to disk when memory is insufficient. Calculate th...
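As a rough illustration of the bookkeeping this module performs, the Java sketch below tracks, per running task, the memory consumed and the amount of data spilled to disk, and aggregates a node-wide snapshot for the sampling module. All class and method names here are hypothetical and are not taken from the patent.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

/**
 * Illustrative sketch (not the patented implementation): a per-worker
 * feedback module that tracks how much memory each running task has
 * consumed and how many bytes it has spilled to disk under memory pressure.
 */
public class MemoryInfoFeedback {

    /** Per-task counters for consumed memory and spill-to-disk volume. */
    public static final class TaskMemoryStats {
        final AtomicLong bytesConsumed = new AtomicLong();
        final AtomicLong bytesSpilledToDisk = new AtomicLong();
    }

    private final Map<Long, TaskMemoryStats> statsByTaskId = new ConcurrentHashMap<>();

    /** Called whenever a running task allocates memory. */
    public void recordAllocation(long taskId, long bytes) {
        statsByTaskId.computeIfAbsent(taskId, id -> new TaskMemoryStats())
                     .bytesConsumed.addAndGet(bytes);
    }

    /** Called when a task spills data to disk because memory is insufficient. */
    public void recordSpill(long taskId, long bytes) {
        statsByTaskId.computeIfAbsent(taskId, id -> new TaskMemoryStats())
                     .bytesSpilledToDisk.addAndGet(bytes);
    }

    /** Aggregates the node-wide snapshot fed back to the sampling module. */
    public MemorySnapshot snapshot() {
        long consumed = 0, spilled = 0;
        for (TaskMemoryStats s : statsByTaskId.values()) {
            consumed += s.bytesConsumed.get();
            spilled  += s.bytesSpilledToDisk.get();
        }
        return new MemorySnapshot(statsByTaskId.size(), consumed, spilled);
    }

    /** Immutable snapshot passed to the information sampling and analysis module. */
    public record MemorySnapshot(int runningTasks, long bytesConsumed, long bytesSpilledToDisk) {}
}
```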



Abstract

The invention discloses a system for settling the fierce competition for memory resources in a big data processing system. A memory information feedback module monitors the memory usage of running thread tasks, converts the collected memory information, and feeds it back to an information sampling and analysis module. The information sampling and analysis module dynamically controls the number of times information is sampled from all working nodes, analyzes the data once the specified number of samples has been reached, and calculates the optimal CPU-to-memory proportion of the current working node. A decision-making and task distribution module then decides, based on the analysis results and the task running information of the current working node, whether new tasks are distributed to that node for computation, thereby effectively constraining the relationship between CPU and memory use. With this system, a memory-aware task distribution mechanism can be implemented on a general-purpose big data platform, the I/O overhead incurred when data spills to disk due to fierce competition for memory resources can be reduced, and the overall performance of the system can be effectively improved.
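To make the sampling-and-decision flow described above concrete, here is a minimal, self-contained Java sketch: it accumulates a configured number of memory samples, estimates how many concurrent tasks the node's memory can sustain (a stand-in for the "optimal CPU-to-memory proportion"), and admits new tasks only while that limit is not exceeded. The formula, thresholds, and names are assumptions made for illustration, not the claimed implementation.

```java
/**
 * Illustrative sketch of the sampling/analysis and decision logic outlined in
 * the abstract; all names and formulas are hypothetical.
 */
public class TaskAdmissionController {

    private final int requiredSamples;   // configured number of sampling rounds
    private long sampleCount = 0;
    private long totalBytesConsumed = 0;
    private long totalRunningTasks = 0;

    public TaskAdmissionController(int requiredSamples) {
        this.requiredSamples = requiredSamples;
    }

    /** Accumulate one snapshot reported by a worker's memory feedback module. */
    public void addSample(long bytesConsumed, int runningTasks) {
        sampleCount++;
        totalBytesConsumed += bytesConsumed;
        totalRunningTasks  += runningTasks;
    }

    /**
     * After the required number of samples, estimate how many concurrent tasks
     * (CPU slots) the node's memory can sustain: memory capacity divided by the
     * observed average memory footprint per task.
     */
    public int optimalConcurrentTasks(long nodeMemoryBytes) {
        if (sampleCount < requiredSamples || totalRunningTasks == 0) {
            return Integer.MAX_VALUE;    // not enough data yet: do not limit
        }
        long avgBytesPerTask = totalBytesConsumed / totalRunningTasks;
        long limit = Math.max(1, nodeMemoryBytes / Math.max(1, avgBytesPerTask));
        return (int) Math.min(Integer.MAX_VALUE, limit);
    }

    /** Decide whether a new task may be dispatched to this node right now. */
    public boolean admitNewTask(int tasksCurrentlyRunning, long nodeMemoryBytes) {
        return tasksCurrentlyRunning < optimalConcurrentTasks(nodeMemoryBytes);
    }
}
```

In a real deployment, logic of this kind would sit next to the platform's scheduler (for example, the slot or executor assignment of a Hadoop- or Spark-style framework), which is where the decision-making and task distribution module would intervene.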

Description

Technical Field

[0001] The invention belongs to the field of I/O performance optimization within computer system architecture, and more specifically relates to a system for settling the fierce competition for memory resources in a big data processing system.

Background

[0002] With the advent of the big data era, cluster programming models such as MapReduce and Dryad are widely used to process ever-growing data sets. These models provide automatic task scheduling, fault tolerance, and load balancing mechanisms, and their implementation details are transparent to users. Among them, the MapReduce model is the most widely used.

[0003] The open-source distributed processing system Hadoop, as the most typical representative of the MapReduce model, has been used by industry to process a variety of offline batch-processing applications. However, Hadoop's entire execution flow is designed around the disk: intermediate data needs to be ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/50
CPC: G06F9/5016; G06F9/5027; G06F9/5083
Inventors: 石宣化, 金海, 裴成, 张雄
Owner: HUAZHONG UNIV OF SCI & TECH