
Multi-version inference model deployment method, device and system in edge computing environment

An edge computing and multi-version model technology, applied to the deployment of multi-version inference models, which addresses the problems that the number of future inference requests cannot be accurately known in advance and that inference accuracy is consequently reduced.

Active Publication Date: 2020-07-28
NANJING UNIV +2

AI Technical Summary

Problems solved by technology

When user inference requests are numerous, the system tends to deploy inference model instances that occupy fewer resources, but the inference accuracy they produce decreases accordingly; conversely, when user inference requests are few, the limited edge computing resources can be fully used to deploy high-accuracy inference model instances.

[0004] However, the number of user inference requests changes over time, and providing an inference service requires deploying a model instance first: starting the container, setting up the base environment, preparing the corresponding data, and so on. The actual number of future user inference requests cannot be accurately known in advance, and only after these preparations are complete can the system respond to user inference requests for the subsequent period of time.

Existing scheduling strategies cannot adapt online and dynamically to changing user inference requests under limited edge resources. A new method for deploying multi-version inference models in an edge computing environment is therefore needed, one that schedules flexibly to maximize the user's inference quality of service, i.e., inference accuracy.
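To make the tradeoff above concrete, the following is a minimal sketch of a load-aware version choice. The version profiles, capacity model, and greedy rule are all illustrative assumptions invented for this example; they are not the patented scheduling method.

```python
from dataclasses import dataclass

@dataclass
class ModelVersion:
    name: str
    accuracy: float           # inference accuracy of this version (hypothetical)
    cpu_per_instance: float   # resource units one running instance occupies
    reqs_per_instance: float  # requests one instance can serve per time slot

def pick_version(versions, cpu_budget, expected_reqs):
    """Greedily pick the most accurate version whose instances can still
    serve the expected load within the node's resource budget."""
    for v in sorted(versions, key=lambda v: v.accuracy, reverse=True):
        instances = -(-expected_reqs // v.reqs_per_instance)  # ceiling division
        if instances * v.cpu_per_instance <= cpu_budget:
            return v.name, int(instances)
    # Heavy load: fall back to as many of the cheapest instances as fit.
    cheapest = min(versions, key=lambda v: v.cpu_per_instance)
    return cheapest.name, int(cpu_budget // cheapest.cpu_per_instance)

# Hypothetical profiles: higher accuracy costs more resources per instance.
versions = [
    ModelVersion("large",  0.95, cpu_per_instance=4.0, reqs_per_instance=50),
    ModelVersion("medium", 0.90, cpu_per_instance=2.0, reqs_per_instance=80),
    ModelVersion("small",  0.82, cpu_per_instance=1.0, reqs_per_instance=120),
]
print(pick_version(versions, cpu_budget=8.0, expected_reqs=300))  # -> ('medium', 4)
```

With few requests the same budget admits the high-accuracy version, mirroring the tradeoff described above; the difficulty the patent targets is that the expected request count is not known in advance.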



Examples


Detailed Description of Embodiments

[0060] The technical solutions of the present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0061] Referring to figure 1, in one embodiment an edge computing inference model deployment system is applied in a mobile network. The system comprises edge computing nodes, a control node, and a network connecting the edge computing nodes, where the edge network consists of backhaul links connecting each edge computing node to the core switch. The edge network allows inference requests to be offloaded and migrated between edge computing nodes; at the same time, in cooperation with the backbone network, the required inference models can be downloaded from the data center to the target edge node. The resources on each edge computing node are heterogeneous and limited, and inference model instances can run within the scope allowed by the respecti...
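Purely to illustrate the entities this embodiment names (edge nodes with heterogeneous, limited resources; a control node; request migration over the edge network; model download over the backbone network), here is a schematic Python sketch. All class, field, and method names are invented for the example and do not come from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class EdgeNode:
    """An edge computing node; capacities differ across nodes (heterogeneous)."""
    node_id: str
    cpu_capacity: float
    deployed: Dict[str, int] = field(default_factory=dict)  # version -> instance count
    queue: List[object] = field(default_factory=list)       # to-be-processed requests

@dataclass
class ControlNode:
    """Controller that adjusts per-slot deployments and moves requests
    between edge nodes over the backhaul links to the core switch."""
    nodes: List[EdgeNode]

    def migrate_requests(self, src: EdgeNode, dst: EdgeNode, n: int) -> None:
        """Offload up to n queued requests from src to dst over the edge network."""
        moved, src.queue[:] = src.queue[:n], src.queue[n:]
        dst.queue.extend(moved)

    def download_model(self, node: EdgeNode, version: str) -> None:
        """Fetch a model version from the data center via the backbone network,
        making it deployable on the target node (modeled here as registration)."""
        node.deployed.setdefault(version, 0)
```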



Abstract

The invention discloses a multi-version inference model deployment method, device and system in an edge computing environment. In the method, the inference model instance deployment strategy on each edge computing node for the next time slot is continuously adjusted and updated according to the number of user inference requests arriving online, the queues of to-be-processed inference requests currently at each edge computing node, and feedback on the multi-version inference model instance deployment of the current time slot. When the actual number of future user inference requests cannot be accurately obtained, the deployment system and method combine the effect feedback from each deployment to periodically adjust the number of multi-version model instances on each edge node, thereby maximizing user inference accuracy in the edge environment.
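The abstract thus describes a closed feedback loop executed once per time slot: observe the online arrivals and the pending queue at each node, take the accuracy feedback from the current slot's deployment, and adjust the next slot's instance counts. The sketch below shows the shape of one such step, reusing the hypothetical version profiles from the earlier example; the one-instance-at-a-time adjustment rule and the 0.9 accuracy threshold are placeholder assumptions, not the patent's actual update policy.

```python
from typing import Dict

# Hypothetical profiles: version -> (cpu per instance, requests served per slot).
PROFILES: Dict[str, tuple] = {
    "large": (4.0, 50), "medium": (2.0, 80), "small": (1.0, 120),
}

def adjust_deployment(deployed: Dict[str, int], cpu_capacity: float,
                      arrivals: int, backlog: int, feedback: float) -> None:
    """One per-slot step for one edge node, driven by the three signals the
    abstract names: online arrivals, the to-be-processed queue, and the
    accuracy feedback of the current slot's deployment."""
    demand = arrivals + backlog
    capacity = sum(PROFILES[v][1] * n for v, n in deployed.items())
    used_cpu = sum(PROFILES[v][0] * n for v, n in deployed.items())

    if demand > capacity:
        # Falling behind: add the cheapest instance that still fits the budget.
        for v, (cpu, _) in sorted(PROFILES.items(), key=lambda kv: kv[1][0]):
            if used_cpu + cpu <= cpu_capacity:
                deployed[v] = deployed.get(v, 0) + 1
                return
    elif feedback < 0.9:
        # Keeping up but accuracy is low: trade one cheap instance for a
        # pricier (assumed more accurate) one if the freed budget allows.
        running = [v for v, n in deployed.items() if n > 0]
        if running:
            cheap = min(running, key=lambda v: PROFILES[v][0])
            best = max(PROFILES, key=lambda v: PROFILES[v][0])
            if cheap != best and used_cpu - PROFILES[cheap][0] + PROFILES[best][0] <= cpu_capacity:
                deployed[cheap] -= 1
                deployed[best] = deployed.get(best, 0) + 1

# Example: an overloaded node adds a cheap instance for the next slot.
deployed = {"small": 2}
adjust_deployment(deployed, cpu_capacity=8.0, arrivals=200, backlog=120, feedback=0.85)
print(deployed)  # -> {'small': 3}
```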

Description

Technical Field

[0001] The present invention relates to the field of edge computing, and in particular to a method, device and system for deploying multi-version inference models in an edge computing environment.

Background Technique

[0002] Edge computing aims to shorten the round-trip delay of user access to cloud data centers by deploying services on edge computing nodes close to users, so that users can directly use the various services deployed on nearby edge computing nodes. A machine learning inference service uses an inference model trained in advance (and possibly continuously updated and revised) to respond to user inference requests. Such inference models include deep learning models, decision tree models, various regression models, and various clustering models. A characteristic of this type of inference model is that many different model versions are generated during the training process. The difference between different model versions is th...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F 8/60, G06F 8/656, G06F 8/71
CPC: G06F 8/60, G06F 8/656, G06F 8/71
Inventors: 金熠波, 钱柱中, 韦磊, 缪巍巍, 张明明, 曾锃, 张明轩
Owner: NANJING UNIV