
Edge computing platform container deployment method and system based on load prediction

A technology combining load prediction and edge computing, applied in computing, neural learning methods, instruments, etc., with the effects of avoiding wasted resource consumption, reducing container migration, and avoiding feedback lag

Active Publication Date: 2019-09-13
XI AN JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

[0004] The technical problem to be solved by the present invention is, in view of the deficiencies in the above-mentioned prior art, to provide a container deployment method and system for an edge computing platform based on load prediction, solving the feedback-lag problem caused by current container deployment algorithms that use only a node's current load information. That is, by predicting a node's resource occupancy over the near future, containers are reasonably deployed to each computing node on the premise that the node's original tasks continue to run well.
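The key idea in [0004] is to place a container according to *predicted* occupancy rather than the instantaneous load, which lags behind reality. A minimal sketch of such a placement decision is below; the function name, the occupancy scale (fractions of node capacity in [0, 1]), and the selection rule (most predicted headroom) are illustrative assumptions, not details taken from the patent.

```python
# Hypothetical sketch: pick a deployment node using predicted load
# instead of the current load, to avoid feedback lag.

def choose_node(predicted_occupancy, required, capacity=1.0):
    """Return the node whose predicted resource occupancy leaves the
    most headroom for `required`, or None if no node fits.

    predicted_occupancy: dict node_id -> predicted occupancy in [0, 1]
    required: fraction of node capacity the new container needs
    """
    best, best_headroom = None, 0.0
    for node, occ in predicted_occupancy.items():
        headroom = capacity - occ
        if headroom >= required and headroom > best_headroom:
            best, best_headroom = node, headroom
    return best

# Node A looks fine *now*, but its predicted occupancy is high,
# so the container goes to node B.
print(choose_node({"A": 0.9, "B": 0.4}, required=0.3))  # -> B
```

The point of the sketch is only the interface: the scheduler consumes a forecast, not a measurement.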




Embodiment Construction

[0035] Since the computing nodes of edge computing are often non-dedicated, a long-running task typically already exists on a node before a computing task is dispatched to it. This pre-existing task is called the original task, and the resource occupation it causes is called the original load.

[0036] See Figure 1. The edge computing platform container deployment system based on load prediction provided by the present invention includes an original-load monitoring system, a node load prediction system, and a computing task management system. The original-load monitoring system runs long-term on the computing nodes, while the node load prediction system and the computing task management system run long-term on the central node. Multiple computing nodes connect to the node load prediction system and send their original-load information to it. The node load prediction system receives the nodes' original-load information and manag...
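The three components of [0036] and their message flow can be sketched as plain classes. All class and method names are assumptions made for illustration, and the per-node predictor below is a trivial stand-in (last observed value) for the per-node model the system actually maintains on the central node.

```python
# Illustrative sketch of the architecture in [0036]:
# node-side monitors report original load; a central predictor keeps
# one model per node; a task manager consults predictions to deploy.

class LoadMonitor:
    """Runs on a computing node; reports that node's original load."""
    def __init__(self, node_id):
        self.node_id = node_id
    def report(self, occupancy):
        return self.node_id, occupancy  # sent to the central node

class LoadPredictor:
    """Runs on the central node; one history (stand-in for one
    per-node model) per computing node."""
    def __init__(self):
        self.history = {}
    def receive(self, node_id, occupancy):
        self.history.setdefault(node_id, []).append(occupancy)
    def predict(self, node_id):
        # Stand-in forecast: last observed occupancy.
        h = self.history.get(node_id, [])
        return h[-1] if h else 0.0

class TaskManager:
    """Runs on the central node; deploys containers to nodes whose
    predicted occupancy still leaves room for the task."""
    def __init__(self, predictor):
        self.predictor = predictor
    def deploy(self, node_ids, required):
        fits = [(self.predictor.predict(n), n) for n in node_ids
                if self.predictor.predict(n) + required <= 1.0]
        return min(fits)[1] if fits else None

predictor = LoadPredictor()
for node_id, occ in (LoadMonitor("n1").report(0.8),
                     LoadMonitor("n2").report(0.3)):
    predictor.receive(node_id, occ)
mgr = TaskManager(predictor)
print(mgr.deploy(["n1", "n2"], required=0.4))  # -> n2
```

The split mirrors the text: only the monitor lives on the computing node; prediction and deployment decisions stay central.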



Abstract

The invention discloses an edge computing platform container deployment method and system based on load prediction. The container deployment system comprises a plurality of computing nodes and a central node. An original-load monitoring system runs on each computing node; these monitoring systems connect to a node load prediction system and upload their data to the central server through it. The node load prediction system and a computing task management system run on the central node. The node load prediction system maintains an LSTM model for each computing node; it receives the nodes' original-load information and sends prediction results to the computing task management system. The computing task management system is responsible for deploying containers: according to the received information, it feeds back the node number and task time to the node load prediction system and issues containers to the available computing nodes. By reasonably deploying containers to the computing nodes, the method reduces the cost of computing tasks.
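The abstract specifies one LSTM model per computing node for forecasting that node's original load. The patent extract does not give the architecture or training details, so the following is only a minimal, untrained sketch of a standard LSTM cell applied to an occupancy series; the dimensions, initialization, and linear readout are all assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One standard LSTM cell step.
    x: input vector (D,), h/c: previous hidden/cell state (H,),
    W (4H x D), U (4H x H), b (4H,): stacked gate parameters."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[0:H])        # input gate
    f = sigmoid(z[H:2*H])      # forget gate
    o = sigmoid(z[2*H:3*H])    # output gate
    g = np.tanh(z[3*H:4*H])    # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
D, H = 1, 8                     # one feature: resource occupancy
W = rng.normal(0, 0.1, (4 * H, D))
U = rng.normal(0, 0.1, (4 * H, H))
b = np.zeros(4 * H)
w_out = rng.normal(0, 0.1, H)   # linear readout to a scalar forecast

h, c = np.zeros(H), np.zeros(H)
series = [0.2, 0.25, 0.3, 0.35]  # recent occupancy samples for one node
for occ in series:
    h, c = lstm_step(np.array([occ]), h, c, W, U, b)
forecast = float(w_out @ h)      # untrained weights: illustrative only
```

In the described system, one such model (trained, and presumably deeper) would be kept per node on the central node, fed by that node's monitoring stream.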

Description

Technical Field

[0001] The invention belongs to the technical field of edge computing platforms, and in particular relates to a container deployment method and system for an edge computing platform based on load prediction.

Background

[0002] In recent years, with the development of the mobile Internet, the amount of Internet data has grown explosively, and more and more Internet services are based on big-data analysis. This has led to a rapid increase in demand for computing resources: the computing power of a single computer can no longer meet the demand, so cloud computing came into being. Cloud computing is the product of integrating traditional computer and network technologies such as distributed computing, parallel computing, virtualization, and load balancing. Through virtual machine technology, cloud computing virtualizes a large number of servers into computing resource nodes, so users do not need to care about hardware imple...


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06F9/455; G06N3/04; G06N3/08
CPC: G06F9/45533; G06N3/08; G06F2009/4557; G06N3/045
Inventors: 伍卫国, 康益菲, 徐一轩, 杨傲, 崔舜
Owner XI AN JIAOTONG UNIV