
Computing system load balancing method and device and storage medium

A load-balancing technology for computing systems, applied in the field of knowledge-based computer systems, computing, computing models, etc. It addresses problems such as system crashes, reduced overall system performance, and decreased system processing efficiency, with the effects of ensuring load balancing and improving efficiency.

Inactive Publication Date: 2020-10-02
STATE GRID ELECTRIC POWER RES INST +3

AI Technical Summary

Problems solved by technology

However, because certain hardware differences between the GPU units and the CPU units are not taken into account, load that accumulates on the GPU units over a long period reduces the system's processing efficiency and, in severe cases, may cause a system crash, degrading overall system performance.

Method used



Embodiment Construction

[0038] The technical solutions of the present invention will be further described below in conjunction with the embodiments.

[0039] The computing system load balancing method described in the present invention comprises the following steps:

[0040] Assume that a CPU assembly unit is composed of m CPU units of the same type and a GPU assembly unit is composed of n GPU cards of the same type; these assembly units are numbered.

[0041] Let sample sᵢ ≡ {xᵢ, yᵢ}, where xᵢ represents the feature vector obtained after feature extraction of the CPU assembly unit or the GPU assembly unit, its components are the feature values, k is the number of features, and i is the sample number; yᵢ ∈ {1, 2, 3, 4, 5, 6, 7, 8, 9} represents 9 different load levels, where a smaller value of yᵢ indicates a lower load level. Experiments show that the features affecting the load of a processing unit include: cache size, number of threads, number of ALUs, current unit load, unit user-process occupancy, unit waiting fo...
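The page does not reproduce the patent's trained decision tree, so the following is only a minimal sketch of the idea: a hand-rolled tree maps a unit's feature vector to one of the 9 load levels, and a lower load level earns a higher scheduling weight. The feature names (`current_load`, `threads`, `cache_mb`), thresholds, and the `10 - level` weighting rule are all hypothetical stand-ins, not the patent's actual model.

```python
def load_level(features):
    """Toy decision tree mapping a unit's features to a load level 1..9.

    The feature names and thresholds here are hypothetical; the patent
    learns its tree from samples s_i = {x_i, y_i} as described above.
    """
    if features["current_load"] > 0.8:
        return 9                      # heavily loaded unit
    if features["threads"] >= 8 and features["cache_mb"] >= 16:
        return 2 if features["current_load"] < 0.3 else 4
    return 6                          # modest unit under moderate load


def weight_of(level):
    """Lower load level (less loaded unit) -> higher scheduling weight.

    The linear 10 - level mapping is an illustrative assumption.
    """
    return 10 - level


gpu = {"current_load": 0.2, "threads": 16, "cache_mb": 32}
cpu = {"current_load": 0.9, "threads": 8, "cache_mb": 16}
print(weight_of(load_level(gpu)))  # -> 8
print(weight_of(load_level(cpu)))  # -> 1
```

Any tree learner (CART, C4.5, etc.) trained on labeled samples would take the place of the hard-coded branches above.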



Abstract

The invention discloses a computing system load balancing method that dynamically assigns a weight to a GPU (Graphics Processing Unit) through a decision tree and then dynamically allocates computing tasks using a smooth weighted round-robin (polling) method, effectively improving the efficiency of a cooperative CPU (Central Processing Unit) and GPU computing system while ensuring the load balancing of the system. The invention further provides a computing system load balancing device based on the method, and a storage medium.
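The "smooth polling weighting method" named in the abstract is commonly implemented as smooth weighted round-robin (the scheme popularized by nginx): each round, every unit's running score grows by its weight, the highest-scoring unit receives the task, and its score drops by the total weight. A minimal sketch, assuming the per-unit weights have already been produced by the decision tree (the weight values below are hypothetical):

```python
def smooth_weighted_rr(weights, rounds):
    """Smooth weighted round-robin task dispatch.

    weights: one weight per processing unit (e.g. from the decision tree).
    Returns the index of the unit chosen in each round; picks are spread
    smoothly rather than bursting on the heaviest-weighted unit.
    """
    current = [0] * len(weights)       # running score per unit
    total = sum(weights)
    picks = []
    for _ in range(rounds):
        for i, w in enumerate(weights):
            current[i] += w            # every unit gains its weight
        best = max(range(len(weights)), key=lambda i: current[i])
        current[best] -= total         # winner pays back the total
        picks.append(best)
    return picks


# Three units with illustrative weights 5, 1, 1:
print(smooth_weighted_rr([5, 1, 1], 7))  # -> [0, 0, 1, 0, 2, 0, 0]
```

Note how unit 0 gets 5 of 7 picks, but the picks for units 1 and 2 are interleaved rather than queued at the end, which is the "smooth" property the abstract relies on.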

Description

Technical Field

[0001] The present invention relates to a computing task allocation strategy, and in particular to a computing system load balancing method, device, and storage medium.

Background Technique

[0002] With the development of technologies such as cloud computing and artificial intelligence, the demand for computing power from large computer systems keeps growing. The CPU architecture of traditional computers can no longer provide the computing power required by these computation-intensive algorithms. In this situation, the GPU shoulders the responsibility of providing large-scale cloud computing capability in the era of big data. GPUs excel at parallel computing but are not well suited to complex processes such as branch prediction and out-of-order execution; handling such complex processes is precisely the strength of the CPU. Therefore, how to use the CPU and GPU for collaborative com...

Claims


Application Information

IPC(8): G06F9/50, G06N5/00
CPC: G06F9/505, G06F9/5083, G06F9/5072, G06N5/01
Inventors: 梅竹, 俞俊, 王琳, 陈晓露, 夏天, 许明杰, 陈海洋, 庞恒茂
Owner STATE GRID ELECTRIC POWER RES INST