
Optimization method oriented to task parallel programming model under virtualization environment

A technology relating to virtualization environments and programming models, applied in the field of optimizing task-parallel programming models under virtualization environments. It addresses the lack of optimization for virtualized environments, low performance, and wasted computing resources, and achieves the effects of low overhead, reduced waste of computing resources, and reduced scheduling delay.

Active Publication Date: 2015-07-08
HUAZHONG UNIV OF SCI & TECH
Cites: 4 | Cited by: 5

AI Technical Summary

Problems solved by technology

If a thief thread in the task-parallel programming model keeps attempting to steal tasks and fails, the virtual CPU executing that thread wastes computing resources.
Existing task-parallel programming models, such as Cilk++, TBB, and BWS, lack optimizations for virtualized environments that reduce the computing resources wasted by thief threads.
As a result, task-parallel programming models suffer from low performance in virtualized environments.
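The sketch below (assumed names only; it is not code from the patent or from Cilk++/TBB/BWS) illustrates the behaviour described above: a thief thread loops over possible victims, and every unsuccessful round leaves it spinning, so the virtual CPU running it keeps consuming its time slice without doing useful work.

```cpp
#include <cstddef>
#include <functional>

// `try_steal(v)` abstracts "attempt to steal one task from worker v".
// It is a placeholder supplied by the caller, not an API of any real runtime.
void thief_loop(std::size_t num_workers, std::size_t self,
                const std::function<bool(std::size_t)>& try_steal) {
    while (true) {
        bool stole = false;
        for (std::size_t v = 0; v < num_workers; ++v) {
            if (v != self && try_steal(v)) { stole = true; break; }
        }
        if (stole) return;  // a task was found: go execute it
        // Each pass through this point is a round of failed steals.  The
        // loop never blocks or yields, so in a virtualized environment the
        // vCPU executing it keeps burning its time slice on useless polling.
    }
}
```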



Examples


Embodiment 1

[0059] Table 1: Experiment configuration environment of Embodiment 1 (the table itself is not reproduced in this extract)

[0061] As listed in Table 1, Embodiment 1 deploys 8 guest virtual machines, each with 16 virtual CPUs, on a single 16-core physical server, and starts 1, 2, 4, and 8 of the guest virtual machines respectively to simulate scenarios in which a single physical core is shared by 1, 2, 4, and 8 virtual CPUs. Embodiment 1 runs the Conjugate Gradient (CG) application, which is built on the task-parallel programming model, in guest virtual machines 1 to 8, and measures the running time of CG under Cilk++, under BWS, and under the support of the present invention. CG comes from a set of applications developed by NASA that represent fluid-dynamics computations; Cilk++ is the most widely used task-parallel programming model; BWS, implemented on top of Cilk++, is the best task-parallel programming model in the traditional single-machine, multi-core, multi-application environment; in Embodiment 1, accelerati...
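The paragraph above is cut off where the acceleration figures would follow. For orientation only, the sketch below shows how an acceleration ratio of this kind is conventionally computed from two measured running times; the variable names and placeholder numbers are assumptions, not values from the patent.

```cpp
#include <cstdio>

// Acceleration ratio = baseline running time / optimized running time.
// A result greater than 1 means the optimized run is faster.
double acceleration_ratio(double baseline_seconds, double optimized_seconds) {
    return baseline_seconds / optimized_seconds;
}

int main() {
    // Placeholder values only -- substitute the measured CG running times
    // under Cilk++ (or BWS) and under the method of the invention.
    double t_baseline  = 100.0;
    double t_optimized = 80.0;
    std::printf("acceleration ratio: %.2f\n",
                acceleration_ratio(t_baseline, t_optimized));
    return 0;
}
```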



Abstract

The invention discloses an optimization method oriented to the task-parallel programming model under a virtualization environment. Failed stealing operations in the task-parallel programming model are captured by a front-end monitoring part in the guest virtual machine; a back-end acceleration part in the virtual machine monitor then decides whether to perform acceleration according to the running states of the acceleration initiator and the accelerated candidate and the physical CPUs on which they are located. If acceleration is performed, the remaining time slice of the acceleration initiator is given to the accelerated candidate; when the accelerated candidate is later preempted by the virtual machine monitor (its time slice is used up or it blocks), its original scheduling path is restored if it is still in the runnable state. Optimization for the virtualization environment is thereby added to the existing task-parallel programming model: the computing resources wasted by virtual CPUs running thief threads are reduced, the scheduling delay of virtual CPUs running useful threads is shortened, and physical computing resources are devoted to effective computation as far as possible.
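The sketch below illustrates the front-end/back-end division of work described in the abstract. It is not the patented implementation: the type names, fields, and the exact acceleration condition are assumptions made for readability, and the restoration of the original scheduling path is only noted in a comment.

```cpp
#include <cstdint>

enum class VcpuState { Running, Runnable, Blocked };

struct Vcpu {
    int           id;                  // virtual CPU identifier
    VcpuState     state;               // current running state
    int           physical_cpu;        // physical CPU the vCPU is placed on
    std::uint64_t remaining_slice_us;  // remaining time slice, microseconds
};

// Back-end acceleration part in the virtual machine monitor: decide whether
// the vCPU that reported repeated failed steals (the acceleration initiator)
// should hand its remaining time slice to a vCPU carrying useful work (the
// accelerated candidate).  The decision described in the abstract also takes
// into account the physical CPUs on which both vCPUs are located.
bool should_accelerate(const Vcpu& initiator, const Vcpu& candidate) {
    return initiator.state == VcpuState::Running &&
           initiator.remaining_slice_us > 0 &&        // something left to donate
           candidate.state == VcpuState::Runnable;    // useful work is waiting
}

// Donate the initiator's remaining time slice to the candidate.  When the
// hypervisor later preempts the candidate (its slice is used up or it blocks)
// and the candidate is still runnable, its original scheduling path is
// restored; that bookkeeping is omitted here.
void accelerate(Vcpu& initiator, Vcpu& candidate) {
    candidate.remaining_slice_us += initiator.remaining_slice_us;
    initiator.remaining_slice_us = 0;
}
```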

Description

Technical Field

[0001] The invention belongs to the technical field of virtualization and parallel program optimization, and more specifically relates to an optimization method oriented to the task-parallel programming model under a virtualization environment.

Background Technique

[0002] With the increase in the number of processor cores, the computer programming model has shifted from the traditional serial programming model to new parallel programming models, so that practical performance gains keep pace with the growing number of cores; in recent years, the task-parallel programming model has been widely used to develop parallel applications, with the aim of simplifying parallel programming and improving multi-core utilization.

[0003] The core technique of the task-parallel programming model is task-stealing scheduling: each processor core corresponds to a thread, and each thread maintains a double-ended queue. The tail of the queue is used to push ready tasks or pop out tasks t...
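To make the scheduling scheme of paragraph [0003] concrete, the following is a minimal sketch of a work-stealing double-ended queue (illustrative only; the class and member names are assumptions, not code from the patent or from Cilk++): the owning worker pushes and pops ready tasks at the tail, while thief threads steal from the head, and an empty deque yields a failed steal.

```cpp
#include <deque>
#include <mutex>
#include <optional>

struct Task { void (*run)(void*); void* arg; };  // a ready task to execute

class WorkStealingDeque {
    std::deque<Task> tasks_;
    std::mutex mu_;  // a lock keeps the sketch simple; real runtimes use lock-free deques
public:
    // Owner thread: push a newly ready task at the tail.
    void push(Task t) {
        std::lock_guard<std::mutex> lk(mu_);
        tasks_.push_back(t);
    }
    // Owner thread: pop the most recently pushed task from the tail.
    std::optional<Task> pop() {
        std::lock_guard<std::mutex> lk(mu_);
        if (tasks_.empty()) return std::nullopt;
        Task t = tasks_.back(); tasks_.pop_back();
        return t;
    }
    // Thief thread: steal the oldest task from the head of another worker's deque.
    std::optional<Task> steal() {
        std::lock_guard<std::mutex> lk(mu_);
        if (tasks_.empty()) return std::nullopt;  // failed steal
        Task t = tasks_.front(); tasks_.pop_front();
        return t;
    }
};
```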


Application Information

IPC(8): G06F9/44, G06F9/455
Inventors: 吴松, 金海, 彭亚琼
Owner: HUAZHONG UNIV OF SCI & TECH