
Front-end multi-thread scheduling method and system based on cloud platform

A scheduling method and system applied in the fields of multi-programming devices, program-control design, and program startup/switching, achieving the effects of improved response speed and resource utilization, prevention of thread loss, and reduced interface delay.

Status: Inactive · Publication Date: 2020-08-14
Applicant: INSPUR SUZHOU INTELLIGENT TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] The present invention proposes a front-end multi-thread scheduling method and system based on a cloud platform. By adding a processing node between the front end and the back end, the cloud platform can respond to all multi-thread requests in time. This solves the front-end problem of high concurrency with a large number of requests, improves interface response speed and resource utilization, and reduces interface delay and thread loss when a large number of requests are made.



Examples


Embodiment 1

[0032] Embodiment 1 of the present invention proposes a front-end multi-thread scheduling method based on a cloud platform. The bottleneck for highly concurrent requests to a large number of front-end interfaces is that, when there are too many requests, they cannot reach the back end, and interface requests often fail due to timeouts. A processing node is therefore added between the front end and the back end: the front end no longer requests the back end directly, but goes through this transit node. Figure 1 shows a flow chart of the front-end multi-thread scheduling method based on a cloud platform.

[0033] In step S101, receive and respond to multi-thread requests from users, and at the same time sort the multi-thread requests to distinguish their sources. Each received multi-thread request is assigned a serial number composed of a timestamp, a user ID, and a request type, which is used to distinguis...
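A minimal sketch of how such a tagged request might look, assuming the serial number is a simple concatenation of timestamp, user ID, and request type (the field names, delimiter, and TypeScript types are assumptions for illustration, not taken from the patent text):

```typescript
// Hypothetical request-tagging sketch; field names and the serial format are assumptions.
type RequestType = "compute" | "storage" | "network" | "monitoring" | "logging";

interface TaggedRequest {
  serial: string;     // timestamp + user ID + request type
  userId: string;
  type: RequestType;
  payload: unknown;
}

// Compose a serial number from a timestamp, the user ID, and the request type,
// so later stages can sort requests and trace each one back to its source.
function tagRequest(userId: string, type: RequestType, payload: unknown): TaggedRequest {
  const serial = `${Date.now()}-${userId}-${type}`;
  return { serial, userId, type, payload };
}

// e.g. tagRequest("user-42", "compute", { action: "createVM" })
```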



Abstract

The invention provides a front-end multi-thread scheduling method and system based on a cloud platform. The method comprises the following steps: receiving and responding to multi-thread requests, and sorting the multi-thread requests so as to distinguish their sources; caching the multi-thread requests received from the front end; classifying and packaging the cached messages of the multi-thread requests, sending out an inquiry request for the packaged messages, sending the complete multi-thread request to the background for execution after normal interface communication is ensured, and finally returning the result of the complete multi-thread request to the corresponding front end; and monitoring whether each multi-thread request obtains a corresponding response or not. Based on the method, the invention further provides a corresponding system comprising a message queue, a thread pool, a concentrator, a monitoring module, and a front-end resource library. A processing node is added at the front end, a scheduling mechanism is established, and information-processing monitoring is carried out through the monitoring module, so that thread loss is prevented, the multi-thread response speed and resource utilization rate are improved, and the request delay of the interface is reduced.
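To make the flow concrete, the sketch below shows one possible shape of such a processing node: it caches front-end requests in a queue, classifies and batches them by type, probes the back-end interface before dispatching, and tracks pending requests so each result is returned to its front end. The class name, the /health inquiry endpoint, and the batched POST format are assumptions for illustration; the patent excerpt does not specify them.

```typescript
// Illustrative sketch of the described flow: cache requests in a queue,
// classify and package them, check the back-end interface, then dispatch.
interface TaggedRequest {
  serial: string;
  type: string;
  payload: unknown;
}

class ProcessingNode {
  private queue: TaggedRequest[] = [];                        // message queue caching front-end requests
  private pending = new Map<string, (r: unknown) => void>();  // monitored requests awaiting a response

  // Cache a request received from the front end; resolve when the back end answers.
  submit(req: TaggedRequest): Promise<unknown> {
    this.queue.push(req);
    return new Promise(resolve => this.pending.set(req.serial, resolve));
  }

  // Classify cached requests by type, package each group, and dispatch it
  // only after an inquiry confirms the back-end interface is reachable.
  async dispatch(backendUrl: string): Promise<void> {
    const groups = new Map<string, TaggedRequest[]>();
    for (const req of this.queue.splice(0)) {
      const bucket = groups.get(req.type) ?? [];
      bucket.push(req);
      groups.set(req.type, bucket);
    }
    for (const [type, batch] of groups) {
      const ok = await fetch(`${backendUrl}/health`).then(r => r.ok).catch(() => false);
      if (!ok) { this.queue.push(...batch); continue; }       // keep cached, retry later
      const results: { serial: string; result: unknown }[] = await fetch(
        `${backendUrl}/${type}`,
        { method: "POST", body: JSON.stringify(batch) }
      ).then(r => r.json());
      for (const { serial, result } of results) {
        this.pending.get(serial)?.(result);                   // return result to its front end
        this.pending.delete(serial);
      }
    }
  }
}
```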

Description

Technical Field

[0001] The invention belongs to the technical field of cloud platform front-end architecture, and in particular relates to a cloud-platform-based front-end multi-thread scheduling method and system.

Background Technique

[0002] In a cloud platform, resource scheduling is critically important. Cloud platform pages often face highly concurrent requests. For example, creating a virtual machine requires scheduling many modules at the same time, such as computing, storage, network, monitoring, and logging; these modules are unrelated to each other and need to be scheduled and processed separately. When hundreds or even thousands of creation requests arrive at the same time, the problems of high concurrency and a large number of requests appear. Because the processing efficiency and parallel-processing capability of the front-end interface are limited, high concurrency with a large number of requests can easily cause system ...
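As an illustration of this background scenario (not part of the patent text), the sketch below fans one virtual-machine creation out into concurrent requests to the independent modules listed above; the endpoint paths and request shape are assumptions.

```typescript
// Hypothetical illustration: creating one virtual machine fans out to several
// independent modules handled in parallel; with hundreds of such creations at
// once, the front end quickly hits its concurrency limits.
const modules = ["compute", "storage", "network", "monitoring", "logging"];

async function createVirtualMachine(baseUrl: string, spec: object): Promise<unknown[]> {
  // Each module is independent, so its request is issued concurrently.
  return Promise.all(
    modules.map(m =>
      fetch(`${baseUrl}/${m}/create`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(spec),
      }).then(r => r.json())
    )
  );
}
```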


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/48, G06F11/30
CPC: G06F9/4806, G06F11/3051, G06F2209/5011
Inventor: 王雪静
Owner: INSPUR SUZHOU INTELLIGENT TECH CO LTD