
Resource scheduling method, apparatus and device, and readable storage medium

A resource scheduling and computing-resource technology, applied to resource allocation, multiprogramming arrangements, program control design, etc., which can solve problems such as the inability to accurately allocate computing resources and the resulting reduction in the task execution efficiency of edge servers.

Active Publication Date: 2019-11-05
GUANGDONG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0005] In view of this, the object of the present invention is to provide a resource scheduling method, apparatus, device, and computer-readable storage medium that solve the problem that the prior art cannot accurately allocate computing resources and thereby reduces the task execution efficiency of the edge server.



Examples


Embodiment 1

[0053] Please refer to Figure 1, which is a flowchart of a resource scheduling method provided by an embodiment of the present invention. The method includes:

[0054] S101: Acquire execution information of a new task, and input the execution information into a task queue.

[0055] Specifically, when the user runs an artificial intelligence application on a terminal, the terminal sends the task information of the new task to the edge server. This embodiment does not limit the type of terminal used; for example, it may be a smartphone or a tablet computer. Likewise, this embodiment does not limit the artificial intelligence application, that is, the task type of the new task; for example, it may be image recognition, text recognition, or voice recognition. The execution time of different types ...
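To make S101 concrete, here is a minimal sketch of an execution-information record and the task queue it is placed into. The field names (task_type, data_size_mb, computation_amount) are assumptions for illustration; the patent does not fix an exact schema at this point.

```python
# Minimal sketch of the "execution information" record and the task queue described in S101.
# The field names are illustrative assumptions, not the patent's exact schema.
from collections import deque
from dataclasses import dataclass

@dataclass
class ExecutionInfo:
    task_id: int
    task_type: str             # e.g. "image_recognition", "text_recognition", "voice_recognition"
    data_size_mb: float        # size of the input data sent by the terminal
    computation_amount: float  # estimated compute demand, e.g. looked up from a model parameter table

task_queue = deque()  # pending tasks on the edge server

def enqueue_new_task(info: ExecutionInfo) -> None:
    """S101: receive a new task's execution information and input it into the task queue."""
    task_queue.append(info)

enqueue_new_task(ExecutionInfo(task_id=1, task_type="image_recognition",
                               data_size_mb=2.5, computation_amount=1.2e9))
```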

Embodiment 2

[0074] Before the method of Embodiment 1 is used to allocate the computing resources of the edge server, a model parameter table needs to be established in advance, so that the execution information of a new task can be obtained from the model parameter table. Please refer to Figure 2, which is a flowchart of another resource scheduling method provided by an embodiment of the present invention.

[0075] S201: Acquire a task model, and calculate a computation amount corresponding to the task model according to the structure of the task model.

[0076] Obtain the different task models used to perform different tasks, decompose each task model by neural network layer, and obtain the neural network layer information of each layer. This embodiment does not limit the method for acquiring the neural network layer information. For example, a network layer parameter table may be preset, and the network layer information of different neural network layers ...
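As an illustration of S201, the sketch below decomposes a task model into neural-network layers and sums a per-layer computation amount using a preset layer parameter table. The layer types, FLOP formulas, and example model are assumptions; the patent only requires that layer information be obtainable, for example from a preset network layer parameter table.

```python
# Hedged sketch of S201: sum a per-layer computation amount over a model's layers.
# Preset "network layer parameter table": FLOPs per layer type as a function of its parameters.
def conv_flops(in_c, out_c, k, out_h, out_w):
    return 2 * in_c * out_c * k * k * out_h * out_w

def fc_flops(in_features, out_features):
    return 2 * in_features * out_features

LAYER_TABLE = {"conv": conv_flops, "fc": fc_flops}

# A task model described as a list of (layer_type, layer_parameters) entries (illustrative only).
example_model = [
    ("conv", dict(in_c=3,  out_c=16, k=3, out_h=112, out_w=112)),
    ("conv", dict(in_c=16, out_c=32, k=3, out_h=56,  out_w=56)),
    ("fc",   dict(in_features=32 * 56 * 56, out_features=10)),
]

def computation_amount(model):
    """Total computation amount of a task model: the sum over its neural network layers."""
    return sum(LAYER_TABLE[layer_type](**params) for layer_type, params in model)

print(computation_amount(example_model))  # total FLOPs for this illustrative model
```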

Embodiment 3

[0083] The following describes the resource scheduling apparatus provided by the embodiments of the present invention; the resource scheduling apparatus described below and the resource scheduling method described above may be cross-referenced with each other.

[0084] Please refer to Figure 5, which is a schematic structural diagram of a resource scheduling apparatus provided by an embodiment of the present invention, including:

[0085] The execution information obtaining module 100 is used to obtain the execution information of the new task, and input the execution information into the task queue;

[0086] The current reward calculation module 200 is used to obtain the current state of the task queue, and use the current state to calculate the current reward;

[0087] The task action acquisition module 300 is used to input the current reward and the current state into the deep reinforcement neural network model to obtain the current task action;

[0088] The compu...
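The module boundaries of Figure 5 can be sketched as plain Python callables, as below. Only the module split follows the text; the reward, the policy stub, and the final allocation rule are placeholders, and the last (truncated) module is assumed to be a computing-resource allocation module as implied by the abstract.

```python
from typing import Callable, List

class ResourceSchedulingApparatus:
    def __init__(self, drl_policy: Callable[[float, List[float]], List[float]]):
        self.task_queue: List[dict] = []
        self.drl_policy = drl_policy  # stand-in for the deep reinforcement neural network model

    # Module 100: obtain execution information of a new task and input it into the task queue.
    def obtain_execution_info(self, info: dict) -> None:
        self.task_queue.append(info)

    # Module 200: obtain the current state of the task queue and use it to calculate the current reward.
    def current_state_and_reward(self):
        state = [t["computation_amount"] for t in self.task_queue]
        reward = -sum(state)  # placeholder reward: penalize the outstanding backlog
        return state, reward

    # Module 300: input the current reward and state into the DRL model to obtain the current task action.
    def current_task_action(self, reward: float, state: List[float]) -> List[float]:
        return self.drl_policy(reward, state)

    # Assumed allocation module: allocate computing resources to each queued task per the action.
    def allocate_resources(self, action: List[float], total_cycles: float) -> List[float]:
        return [share * total_cycles for share in action]

# Example wiring with a trivial policy that splits resources evenly across queued tasks.
apparatus = ResourceSchedulingApparatus(lambda r, s: [1 / len(s)] * len(s) if s else [])
apparatus.obtain_execution_info({"task_id": 1, "computation_amount": 1.2e9})
state, reward = apparatus.current_state_and_reward()
action = apparatus.current_task_action(reward, state)
print(apparatus.allocate_resources(action, total_cycles=2.0e9))  # -> [2000000000.0]
```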



Abstract

The invention discloses a resource scheduling method comprising the following steps: acquiring execution information of a new task and inputting the execution information into a task queue; obtaining a current state of the task queue and calculating a current reward from the current state; inputting the current reward and the current state into a deep reinforcement neural network model to obtain a current task action; and allocating the computing resources corresponding to each task in the task queue according to the current task action and executing each task in the task queue with the allocated computing resources. By obtaining the current reward and current state of the task queue, inputting them into the deep reinforcement neural network model to obtain the current task action, and allocating computing resources according to that action, the computing resources can be allocated more accurately and reasonably, so that tasks are executed more efficiently. In addition, the invention also provides a resource scheduling apparatus, a device, and a computer-readable storage medium, which have the same beneficial effects.
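Read as pseudocode, the abstract describes one pass through the following steps; the sketch below keeps that order but substitutes a random stand-in for the deep reinforcement neural network model and an assumed total computing capacity, since the abstract specifies neither.

```python
import random

# One pass through the abstract's steps: state -> reward -> DRL action -> allocation.
task_queue = [{"task": "image_recognition", "computation_amount": 1.0e9},
              {"task": "text_recognition",  "computation_amount": 4.0e8}]
TOTAL_CYCLES = 3.0e9  # assumed total computing capacity of the edge server

state = [t["computation_amount"] for t in task_queue]    # current state of the task queue
reward = -sum(state) / TOTAL_CYCLES                      # current reward calculated from the state
logits = [random.random() for _ in task_queue]           # stand-in for the DRL model's output
action = [x / sum(logits) for x in logits]               # current task action: per-task resource shares
allocation = [share * TOTAL_CYCLES for share in action]  # computing resources allocated to each task
print(list(zip((t["task"] for t in task_queue), allocation)))
# ...each task would then be executed with its allocated computing resources...
```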

Description

Technical field

[0001] The present invention relates to the technical field of edge computing, and in particular to a resource scheduling method, apparatus, device, and computer-readable storage medium.

Background technique

[0002] With the maturity of artificial intelligence technology, cloud servers have been widely used to perform tasks such as image processing, network defense, and personalized recommendation. As data volumes increase, however, this approach suffers from poor real-time performance and high data transmission costs. Edge intelligence, which combines edge computing with artificial intelligence, has therefore been proposed.

[0003] Edge intelligence uses edge servers in place of cloud servers to perform the tasks sent by network edge devices, such as image recognition and text recognition. When the tasks are numerous and of many kinds, the computing resources of the edge server must be scheduled so that the tasks can be completed more efficiently. In the prior ...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F9/48; G06F9/50
CPC: G06F9/4881; G06F9/5038
Inventors: 李培春, 余荣, 张浩川, 黄旭民, 林华彪
Owner: GUANGDONG UNIV OF TECH