
Queuing network model training method, queuing optimization method, equipment and medium

A queuing network model training technology, applied in the field of queuing optimization processing, which addresses problems such as calculation errors, mismatches between model assumptions and on-site conditions, and the inability to adapt automatically, while achieving high prediction accuracy.

Pending Publication Date: 2022-04-08
SHANGHAI CLEARTV CORP LTD

AI Technical Summary

Problems solved by technology

[0002] Generally, queuing theory algorithms are used to construct queuing mathematical models to deal with queuing problems. However, the assumptions that queuing theory makes about the probability distribution of arrivals (such as a Poisson distribution) may not match on-site conditions, where arrivals often follow a long-tailed distribution, leading to calculation errors.
Moreover, manually analyzing on-site data to characterize the site has two drawbacks: first, it cannot automatically adapt to changes in site conditions (such as changes in the queuing order or adjustments to the number of queues); second, it cannot automatically establish a general mathematical model that can be deployed in practice across various queuing scenarios.



Examples


Embodiment 1

[0045] A training method for a queuing network model, as shown in Figure 1. The training method includes:

[0046] Step 11: obtain a real-time queuing network graph; the real-time queuing network graph includes a plurality of real-time queuing nodes and the node features of each real-time queuing node.

[0047] Specifically, after the on-site data is obtained, the data is cleaned, including standardization, normalization, and regularization, for subsequent training. The data includes, but is not limited to, the number of nodes (each node being a queuing point), the node adjacency matrix (generally a fully connected graph, in which edges can be enabled or disabled according to the site situation), and node features (including the queuing item, gender, age, number of people in the queue, processor ID, processing item number, other features of the queuing people, other node features, and queue start times).
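As an illustration of the data cleaning and graph construction described in [0047], the following sketch builds a standardized node-feature matrix and a fully connected adjacency matrix from cleaned on-site records. The field names, the standardization choice, and the helper name build_queuing_graph are assumptions made here for demonstration, not the exact pipeline of this disclosure.

```python
# Minimal data-preparation sketch for the on-site queuing data described above.
# Field names and normalization choices are illustrative assumptions.
import numpy as np

def build_queuing_graph(records):
    """Build (node_features, adjacency) from cleaned on-site records.

    records: list of dicts, one per queuing node, e.g.
        {"queue_item": 3, "gender": 1, "age": 42,
         "queue_length": 7, "processor_id": 2, "start_time": 1649376000}
    """
    feature_keys = ["queue_item", "gender", "age",
                    "queue_length", "processor_id", "start_time"]
    x = np.array([[r[k] for k in feature_keys] for r in records], dtype=np.float64)

    # Standardize each feature column (zero mean, unit variance),
    # guarding against constant columns.
    mean, std = x.mean(axis=0), x.std(axis=0)
    x = (x - mean) / np.where(std > 0, std, 1.0)

    # Fully connected adjacency without self-loops, as the text suggests;
    # individual edges can be switched off to match the site layout.
    n = len(records)
    adj = np.ones((n, n)) - np.eye(n)
    return x, adj
```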

[0048] Step 12: input the real-time queuing network graph as training data into the first graph neural network model to be trained ...
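The Abstract describes a two-phase training schedule: lock the pre-trained processor sub-model while the encoder and decoder are trained on field data with the first loss function, then release the processor's parameters and continue training. The PyTorch sketch below illustrates that schedule; the model structure (a `processor` sub-module), the optimizer, the loss, and the epoch counts are assumptions for illustration, not the patented implementation.

```python
# Sketch of a two-phase (freeze, then unfreeze) training schedule.
# The "processor" attribute, Adam optimizer, and hyper-parameters are assumptions.
import torch

def train_queuing_model(model, loader, loss_fn, epochs_frozen=10, epochs_full=5):
    # Phase 1: lock the processor's parameters so only encoder/decoder learn.
    for p in model.processor.parameters():
        p.requires_grad = False
    opt = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=1e-3)
    for _ in range(epochs_frozen):
        for node_feats, adj, target in loader:
            opt.zero_grad()
            loss = loss_fn(model(node_feats, adj), target)
            loss.backward()
            opt.step()

    # Phase 2: release the processor and continue training with the same loss.
    for p in model.processor.parameters():
        p.requires_grad = True
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    for _ in range(epochs_full):
        for node_feats, adj, target in loader:
            opt.zero_grad()
            loss = loss_fn(model(node_feats, adj), target)
            loss.backward()
            opt.step()
    return model
```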

Embodiment 2

[0067] A queuing optimization method, as shown in Figure 4. The optimization method includes:

[0068] Step 41: obtain the queuing network graph to be processed; the queuing network graph to be processed includes a plurality of nodes to be processed and the node features of each node to be processed.

[0069] Step 42: input the queuing network graph to be processed into the queuing network model obtained by the training method of Embodiment 1, and output the optimized queuing network graph.

[0070] The optimized queuing network graph includes the optimized features of each node to be processed.

[0071] Step 43: generate an access sequence for each node to be processed based on the optimized features.

[0072] Here, the feature of each node is output directly by the decoder in one-hot format and is used to identify the node's class.
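As a rough illustration of step 43 and the one-hot decoder output described in [0072], the sketch below converts the decoder's per-node outputs into an access sequence by taking the argmax class of each node and visiting nodes in ascending class order. Interpreting the predicted class as a service priority is an assumption made here for demonstration.

```python
# Illustrative conversion of one-hot decoder outputs into an access sequence.
# Treating the predicted class as a priority rank is an assumption.
import numpy as np

def access_sequence(one_hot_outputs):
    """one_hot_outputs: (num_nodes, num_classes) decoder output per node."""
    # The argmax identifies each node's predicted class.
    priority = one_hot_outputs.argmax(axis=1)
    # Visit nodes in ascending priority order; stable sort keeps the
    # original node order within the same class.
    return np.argsort(priority, kind="stable").tolist()
```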

[0073] Step 44: process the optimized features of ...

Embodiment 3

[0082] An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the queuing network model training method described in Embodiment 1 when executing the computer program, and/or implements the queuing optimization method described in Embodiment 2 when executing the computer program.

[0083] Figure 5 is a schematic structural diagram of an electronic device provided in this embodiment, showing a block diagram of an exemplary electronic device 90 suitable for implementing embodiments of the invention. The electronic device 90 shown in Figure 5 is only an example and should not limit the functions or scope of use of the embodiments of the present invention.

[0084] As shown in Figure 5, the electronic device 90 may take the form of a general-purpose computing device, which may be, for example, a server device. Components of the electronic device 9...



Abstract

The invention discloses a queuing network model training method, a queuing optimization method, a device, and a medium. The training method comprises the following steps: acquiring a real-time queuing network graph; inputting the real-time queuing network graph as training data into a first graph neural network model to be trained, where the first graph neural network model includes a pre-trained first model corresponding to the processor; during training, locking the model parameters of the first model and training the first graph neural network model with a preset first loss function to obtain a first optimized network model corresponding to the encoder and a second optimized network model corresponding to the decoder; then releasing the model parameters of the first model and continuing to train the first graph neural network model with the first loss function to obtain the final queuing network model. By reusing the pre-trained processor, training the encoder and decoder on field data, and then fine-tuning the whole model on that data, relatively high prediction accuracy can be achieved with less training data.

Description

Technical field

[0001] The invention belongs to the field of queuing optimization processing, and in particular relates to a queuing network model training method, a queuing optimization method, a device, and a medium.

Background technique

[0002] Generally, queuing theory algorithms are used to construct queuing mathematical models to deal with queuing problems. However, the assumptions that queuing theory makes about the probability distribution of arrivals (such as a Poisson distribution) may not match on-site conditions, where arrivals often follow a long-tailed distribution, leading to calculation errors. Moreover, manually analyzing on-site data to characterize the site has two drawbacks: first, it cannot automatically adapt to changes in site conditions (such as changes in the queuing order or adjustments to the number of queues); second, it cannot automatically produce a very ...

Claims


Application Information

IPC(8): G06N3/04, G06N3/08, G06K9/62
CPC: Y02P90/30
Inventor: 郝霖, 王国权, 叶德建
Owner: SHANGHAI CLEARTV CORP LTD