
Resource scheduling method, device, equipment and readable storage medium

A resource scheduling technology for computing resources, applied in the fields of resource allocation, multiprogramming devices, program control design, etc. It addresses problems such as the inability to allocate computing resources accurately and the reduced task execution efficiency of edge servers, with the effect of accurate and reasonable allocation of computing resources, improved task execution efficiency, and more efficient execution of tasks.

Active Publication Date: 2022-02-15
GUANGDONG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0005] In view of this, the object of the present invention is to provide a resource scheduling method, device, equipment, and computer-readable storage medium, which solve the problem that the prior art cannot accurately allocate computing resources and thereby reduces the task execution efficiency of the edge server.



Examples


Embodiment 1

[0053] Please refer to Figure 1, which is a flowchart of a resource scheduling method provided by an embodiment of the present invention. The method includes:

[0054] S101: Obtain execution information of a new task, and input the execution information into a task queue.

[0055] Specifically, when a user uses an artificial intelligence application on a terminal, the terminal sends the task information of a new task to the edge server. This embodiment does not limit the type of terminal; for example, it may be a smartphone or a tablet computer. This embodiment also does not limit the artificial intelligence application, that is, the task type of the new task. For example, it may be image recognition, text recognition, or speech recognition, and the execution time of different types of tasks can ...
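As a minimal sketch of step S101 only (not the patent's actual implementation; the field names of the execution information are assumptions), enqueuing a new task's execution information might look like this:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class ExecutionInfo:
    """Hypothetical execution information of a new task (field names are assumed)."""
    task_id: int
    task_type: str         # e.g. "image_recognition", "text_recognition", "speech_recognition"
    compute_amount: float  # estimated workload of the task, e.g. in multiply-accumulate operations
    arrival_time: float

# Task queue the edge server schedules from.
task_queue: deque = deque()

def on_new_task(info: ExecutionInfo) -> None:
    """S101: obtain the execution information of a new task and input it into the task queue."""
    task_queue.append(info)

on_new_task(ExecutionInfo(task_id=1, task_type="image_recognition",
                          compute_amount=2.4e9, arrival_time=0.0))
```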

Embodiment 2

[0074] Before allocating the computing resources of the edge server with the method of Embodiment 1, a model parameter table needs to be established in advance, so that the model parameter table can be used to obtain the execution information of new tasks. For details, please refer to Figure 2, which is a flowchart of another resource scheduling method provided by an embodiment of the present invention.

[0075] S201: Obtain a task model, and calculate the computation amount corresponding to the task model according to the structure of the task model.

[0076] Different task models for performing different tasks are obtained, each task model is decomposed by neural network layer, and the neural network layer information of each layer is obtained. This embodiment does not limit the method for obtaining the neural network layer information. For example, a network layer parameter table may be preset, and the network layer information of different neural networ...
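As an illustrative sketch only (the layer types, parameter fields, and per-layer formulas below are assumptions, not the patent's network layer parameter table), the computation amount of a task model could be accumulated layer by layer like this:

```python
# Hypothetical network layer parameter table: maps a layer type to a function that
# turns that layer's parameters into a computation amount (multiply-accumulate operations).
def conv2d_macs(p):
    # out_h * out_w * kernel_h * kernel_w * in_channels * out_channels
    return p["out_h"] * p["out_w"] * p["k_h"] * p["k_w"] * p["c_in"] * p["c_out"]

def dense_macs(p):
    return p["in_features"] * p["out_features"]

LAYER_PARAM_TABLE = {"conv2d": conv2d_macs, "dense": dense_macs}

def model_compute_amount(layers):
    """Decompose a task model by neural network layer and sum the per-layer computation amounts."""
    return sum(LAYER_PARAM_TABLE[layer["type"]](layer["params"]) for layer in layers)

# Example: a tiny model with one convolutional layer and one fully connected layer.
layers = [
    {"type": "conv2d", "params": {"out_h": 32, "out_w": 32, "k_h": 3, "k_w": 3, "c_in": 3, "c_out": 16}},
    {"type": "dense", "params": {"in_features": 256, "out_features": 10}},
]
print(model_compute_amount(layers))  # total computation amount of the model
```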

Embodiment 3

[0083] The resource scheduling device provided by the embodiment of the present invention is described below; the resource scheduling device described below and the resource scheduling method described above may be referred to in correspondence with each other.

[0084] Please refer to Figure 5, which is a schematic structural diagram of a resource scheduling device provided by an embodiment of the present invention. The device includes the following modules (an illustrative sketch of this module arrangement follows the list):

[0085] The execution information acquisition module 100 is configured to acquire the execution information of the new task and input the execution information into the task queue;

[0086] The current reward calculation module 200 is configured to obtain the current state of the task queue and use the current state to calculate the current reward;

[0087] The task action acquisition module 300 is configured to input the current reward and the current state into the deep reinforcement neural network model to obtain the current task action;

[0088] The computing resource ...
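Purely as an organizational sketch of the modules listed above (class names, method names, and the example state/reward encodings are assumptions, not the patent's definitions):

```python
from collections import deque

class ExecutionInfoAcquisitionModule:
    """Module 100: acquire the execution information of a new task and input it into the task queue."""
    def acquire(self, execution_info: dict, task_queue: deque) -> None:
        task_queue.append(execution_info)

class CurrentRewardCalculationModule:
    """Module 200: obtain the current state of the task queue and use it to calculate the current reward."""
    def calculate(self, task_queue: deque):
        # Assumed encodings for illustration: the state is the per-task workload,
        # and the reward penalizes queue backlog.
        state = [task["compute_amount"] for task in task_queue]
        reward = -sum(state)
        return state, reward

class TaskActionAcquisitionModule:
    """Module 300: feed the current reward and state to the deep reinforcement neural network model."""
    def acquire(self, model, reward, state):
        return model.select_action(reward, state)  # model interface is assumed
```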



Abstract

The invention discloses a resource scheduling method, comprising: acquiring the execution information of a new task and inputting the execution information into a task queue; acquiring the current state of the task queue and calculating the current reward from the current state; inputting the current reward and the current state into a deep reinforcement neural network model to obtain the current task action; and using the current task action to allocate the computing resources corresponding to each task in the task queue, and using the allocated computing resources to execute each task in the task queue. By feeding the current reward and the current state of the task queue into the deep reinforcement neural network model to obtain the current task action, and allocating computing resources according to that action, the method allocates computing resources more accurately and reasonably, thereby improving task execution efficiency and executing tasks more efficiently. In addition, the present invention also provides a resource scheduling device, equipment, and computer-readable storage medium, which also have the above beneficial effects.
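To make the flow described in the abstract concrete, here is a minimal, runnable sketch of the scheduling loop (the state, reward, and action encodings are assumptions, and a simple proportional policy stands in for the deep reinforcement neural network model):

```python
import random
from collections import deque

def current_state(task_queue):
    # Assumed state: the workload (computation amount) of each queued task.
    return [task["compute_amount"] for task in task_queue]

def current_reward(state):
    # Assumed reward: a smaller backlog yields a higher reward.
    return -sum(state)

def select_action(reward, state, total_resources=100.0):
    # Stand-in for the deep reinforcement neural network model: the "task action" here is
    # a resource share for each queued task, proportional to its workload.
    total = sum(state) or 1.0
    return [total_resources * w / total for w in state]

task_queue = deque({"task_id": i, "compute_amount": random.uniform(1.0, 10.0)} for i in range(4))

state = current_state(task_queue)
reward = current_reward(state)
action = select_action(reward, state)

# Allocate the computing resources corresponding to each task and execute the queue.
for task, resources in zip(task_queue, action):
    task["allocated_resources"] = resources
    print(f"task {task['task_id']}: {resources:.2f} resource units allocated")
```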

Description

Technical Field

[0001] The present invention relates to the technical field of edge computing, and in particular to a resource scheduling method, device, equipment, and computer-readable storage medium.

Background Technique

[0002] With the maturity of artificial intelligence technology, cloud servers have been widely used to perform tasks such as image processing, network defense, and personalized recommendation. As the amount of data grows, this approach suffers from poor real-time performance and high data transmission costs. Edge intelligence, which combines edge computing and artificial intelligence, has therefore been proposed.

[0003] Edge intelligence uses edge servers instead of cloud servers to perform tasks sent by network edge devices, such as image recognition and text recognition. When the tasks are of many kinds and large in number, the computing resources of the edge server must be scheduled so that the tasks are completed more efficiently....


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F9/48; G06F9/50
CPC: G06F9/4881; G06F9/5038
Inventors: 李培春, 余荣, 张浩川, 黄旭民, 林华彪
Owner: GUANGDONG UNIV OF TECH