
Method and system for forecasting task resource waiting time

A technology for predicting task waiting time, applied to resource allocation and multi-programming devices. It addresses the problems that massive historical load records are difficult to describe even with a mixed probability distribution model, that probability-model-based evaluation and prediction are inaccurate, and that the records do not obey any single distribution, with the effect of optimizing the configuration of task resource requirements, improving job scheduling, and providing a reliable forecast rate.

Active Publication Date: 2015-04-15
中科海微(北京)科技有限公司
Cites: 3 · Cited by: 22

AI Technical Summary

Problems solved by technology

[0003] Methods that describe resource distribution with a specific probability model run into the following practical problems in large-scale computing system management and big data processing. First, the massive historical load records encountered in practice are difficult to fit with any specific probability distribution; examination of different real data sets shows that the historical load records of big data not only fail to obey a single probability distribution, but are even difficult to describe with a mixed probability distribution model. Second, some commonly used probability models (such as the binomial model and its derivatives) assume that the properties of jobs submitted by a user within a short time interval are mutually independent, which often does not hold in reality. A specific study of the historical load data of actual supercomputers shows that most users submit jobs with similar content and equivalent parameters multiple times within a short period, so using this type of probability model to evaluate and predict resource consumption is often inaccurate.

Method used




Embodiment Construction

[0047] In studying the historical load of the supercomputer Kraken, the inventors found that for about 52.4% of the jobs (tasks) submitted by users, the waiting time in the job execution queue exceeded the actual running time. Every job, and especially every parallel job, has two basic parameters expressing its demand for computing resources: the running time and the number of computing nodes (CPU, memory). From the user's point of view, the invention aims to adjust these two parameters, on the premise of still producing correct results, so that the job runs quickly and its waiting time in the job execution queue is reduced. Statistical analysis also shows that a user often submits multiple similar jobs within a short period of time (for example, the same executable program with different input data), which strongly distorts the statistics, because the number of jobs the system can execute simultaneously is generally limited. The present inven...
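
The filtering of repeated, similar submissions described in this paragraph can be pictured with a short sketch. The record fields (user, executable, node count) and the 10-minute window below are illustrative assumptions, not values fixed by the patent text:

    from dataclasses import dataclass
    from typing import Dict, List, Tuple

    # Hypothetical record layout; real workload logs (e.g. SWF traces) use other field names.
    @dataclass
    class JobRecord:
        user: str
        executable: str
        nodes: int          # number of computing nodes requested
        submit_time: float  # seconds since epoch
        wait_time: float    # seconds spent waiting in the execution queue
        run_time: float     # seconds of actual execution

    def drop_repeated_submissions(records: List[JobRecord], window: float = 600.0) -> List[JobRecord]:
        """Keep only the first job of a burst of similar submissions (same user,
        same executable, same node count) arriving within `window` seconds."""
        kept: List[JobRecord] = []
        last_kept: Dict[Tuple[str, str, int], float] = {}
        for r in sorted(records, key=lambda r: r.submit_time):
            key = (r.user, r.executable, r.nodes)
            prev = last_kept.get(key)
            if prev is None or r.submit_time - prev > window:
                kept.append(r)
                last_kept[key] = r.submit_time
        return kept

Filtering such bursts before any statistics are computed is what keeps a handful of near-identical submissions from dominating the counts used later for prediction.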



Abstract

The invention discloses a method and a system for forecasting task resource waiting time, and relates to resource management, optimization and allocation in large-scale computing systems. The method for forecasting the task resource waiting time includes: acquiring historical task records, deleting task records with dependency relationships from the historical task records, and generating new historical task records; acquiring, from the new historical task records and through auto-correlation functions, the task records in the time periods related to the time period to be forecast, to generate a task record set; and setting a task resource waiting time threshold, acquiring from the task record set the number of task records whose task resource waiting time is longer than the threshold, and forecasting the task resource waiting time in the time period to be forecast through a Bayesian method according to the total number of task records in the task record set. The method and system are capable of forecasting the availability of computing system resources and optimizing task scheduling.
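
Read as an algorithm, the abstract amounts to: select the historically correlated period, count how many task records in it waited longer than the chosen threshold, and turn that count into a probability for the period to be forecast. A minimal sketch of that last step, using a Beta-Binomial (Laplace-smoothed) estimator as one concrete choice of Bayesian method, since the abstract does not specify the prior:

    def forecast_exceed_probability(wait_times, threshold, alpha=1.0, beta=1.0):
        """Estimate P(waiting time > threshold) for the period to be forecast,
        given the waiting times (seconds) of the task record set selected via
        the auto-correlation step. The Beta(alpha, beta) prior is an
        illustrative assumption; alpha = beta = 1 is plain Laplace smoothing."""
        total = len(wait_times)
        exceeded = sum(1 for w in wait_times if w > threshold)
        return (exceeded + alpha) / (total + alpha + beta)

    # Toy usage: six historical waiting times, 30-minute threshold.
    history = [120.0, 4500.0, 300.0, 2400.0, 60.0, 7200.0]
    print(forecast_exceed_probability(history, threshold=1800.0))  # 0.5 for this toy data

The counting itself makes no distributional assumption about the waiting times, which is consistent with the problems the patent raises about single-distribution models.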

Description

Technical Field

[0001] The invention relates to resource management, optimization and allocation in large-scale computing systems (including supercomputing and cloud computing), and in particular to a method and system for predicting the waiting time of task resources.

Background

[0002] Many existing approaches use model-based methods to characterize, predict, optimize, and allocate resource usage in large-scale computing systems. Specifically, researchers first select one or several dimensions related to system resource usage for data tracking and observation (such as the remaining running time of jobs, or the queuing time of jobs in system queues), and then apply an appropriate probability model to describe the probability distribution of the data in those dimensions. Next, they use the distributional properties given by this model to predict the future behavior of the system, so as to achieve resource optimization and reasonable ...
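
As a concrete illustration of the model-based approach sketched in this background section (the prior art being discussed, not the method of the invention), one could fit a single distribution to observed queue waiting times and predict from its tail probability; the exponential form below is an assumption chosen only to keep the example small:

    import math

    def fit_exponential_rate(wait_times):
        """Maximum-likelihood rate of an exponential model of queue waiting times."""
        return len(wait_times) / sum(wait_times)

    def predicted_exceed_probability(rate, threshold):
        """P(waiting time > threshold) under the fitted exponential model."""
        return math.exp(-rate * threshold)

    waits = [120.0, 4500.0, 300.0, 2400.0, 60.0, 7200.0]
    print(predicted_exceed_probability(fit_exponential_rate(waits), threshold=1800.0))

Paragraph [0003] above explains why this kind of single-distribution description is exactly what breaks down on real large-scale workloads.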


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F9/50
Inventors: 尤海航, 邢飞
Owner: 中科海微(北京)科技有限公司