Method for minimizing the computation offloading delay of a mobile user's deep neural network

A technology concerning deep neural networks and mobile users, applied in the field of minimizing the delay of intelligent-application computation offloading. It addresses the increased failure rate of computation offloading caused by user mobility and achieves the effect of delay minimization.

Active Publication Date: 2020-10-02
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

With the advent of the 5G era, the mobility of users cannot be ignored. Users often change the base statio...

Method used




Embodiment Construction

[0041] The present invention will be further described below in conjunction with the accompanying drawings.

[0042] Referring to Figure 1 to Figure 5, a method for minimizing the computation offloading delay of a mobile user's deep neural network comprises the following steps:

[0043] Step 1: Divide the execution time required by the deep neural network into multiple unequal time periods. The division criterion is whether the base station serving the user changes as the user moves. Each base station is deployed with an edge cloud server. Let τ_start denote the moment the DNN request is sent and τ_end the moment the task is completed; within this interval, the time the user stays in the communication area of each base station forms one time period. As shown in Figure 1, the first time period runs from τ_start to τ_2, the second from τ_2 to τ_3, and the third from τ_3 to τ_end;
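The period division described in Step 1 can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the example handover times are assumptions.

```python
# Hedged sketch of Step 1: split the DNN execution window into unequal
# time periods, one per base station the user passes through.

def split_into_periods(tau_start, tau_end, handover_times):
    """Return (start, end) pairs, one period per base-station residence.

    handover_times: moments at which the serving base station changes,
    strictly between tau_start and tau_end, in ascending order.
    """
    boundaries = [tau_start] + list(handover_times) + [tau_end]
    # Consecutive boundaries delimit the stay in each base station's area.
    return list(zip(boundaries[:-1], boundaries[1:]))

# Example mirroring the text: handovers at tau_2 and tau_3 yield three periods.
periods = split_into_periods(0.0, 10.0, [3.0, 7.0])
# periods == [(0.0, 3.0), (3.0, 7.0), (7.0, 10.0)]
```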

[0044] Step 2: Model the DN...
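Step 2 (truncated above) models the DNN as a directed acyclic graph, per the abstract. A minimal sketch of such a layer graph, with purely illustrative layer names, cost figures, and output sizes, might look like:

```python
# Hedged sketch of Step 2: represent the DNN as a DAG whose nodes are
# layers annotated with local/edge compute times and output sizes.
# All names and numbers are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    local_time: float  # seconds to run this layer on the mobile device
    edge_time: float   # seconds to run this layer on the edge cloud server
    out_bytes: int     # size of the layer's output (upload cost if cut here)

# A linear chain is the simplest DAG; branching networks add more edges.
layers = [
    Layer("conv1", 0.030, 0.004, 800_000),
    Layer("conv2", 0.050, 0.006, 400_000),
    Layer("fc1",   0.020, 0.002,  40_000),
]
edges = [(i, i + 1) for i in range(len(layers) - 1)]  # conv1 -> conv2 -> fc1
```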



Abstract

The invention discloses a method for minimizing the computation offloading delay of a mobile user's deep neural network. An intelligent application based on a deep neural network and processed by a user in a moving state is analyzed, and a corresponding model is built to minimize the delay. The deep neural network is modeled as a directed acyclic graph, and an optimal offloading decision is made by cutting the graph multiple times. The cutting process has two stages. In the first stage, only the optimization problem within one time slot is considered: an optimal cut point is found that divides the network model in that slot into a front part and a rear part, where the front part is computed locally and the rear part is offloaded to the edge cloud. In the second stage, from a global perspective, the whole directed acyclic graph is cut into multiple blocks so as to minimize the number of time slots required to complete the whole DNN task. The method achieves the goal of minimizing the DNN computation delay, realizes cooperation between the mobile user and the edge cloud server, and guarantees the continuity of task processing while the user moves.
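The first-stage cut described in the abstract can be sketched for the simple case of a linear layer chain: pick the cut point k so that layers before k run locally and layers from k onward run on the edge cloud, trading local compute against the upload of the intermediate output. This is an illustrative sketch under assumed inputs, not the patent's algorithm; the function name, timings, and bandwidth value are assumptions.

```python
# Hedged sketch of the stage-1 cut for a linear layer chain.
# Layers [0, k) run on the device; layers [k, n) run on the edge cloud.
# Latency = local compute + upload of the cut layer's output + edge compute.

def best_cut(local_times, edge_times, out_bytes, input_bytes, bandwidth):
    """Return (best_k, best_latency). k == 0 means offload everything."""
    n = len(local_times)
    best_k, best_lat = 0, float("inf")
    for k in range(n + 1):
        # What must be uploaded: the raw input if nothing runs locally,
        # otherwise the output of the last locally executed layer.
        upload = input_bytes if k == 0 else out_bytes[k - 1]
        lat = sum(local_times[:k]) + upload / bandwidth + sum(edge_times[k:])
        if lat < best_lat:
            best_k, best_lat = k, lat
    return best_k, best_lat

# Illustrative numbers: a small intermediate output makes an inner cut best.
k, lat = best_cut([1.0, 1.0], [0.1, 0.1], [10, 1], 100, 10)
```

The second stage would then repeat such cuts across time slots so the whole DAG finishes in as few slots as possible, which this sketch does not attempt.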

Description

Technical field

[0001] The invention belongs to the technical field of edge computing offloading, and in particular relates to a method for minimizing the delay of intelligent-application computation offloading based on a deep neural network while the user is in a mobile state.

Background technique

[0002] The latest research results on Deep Neural Networks (DNNs) have greatly improved their performance, and DNNs have been widely used in image recognition, intelligent search, language processing and other fields. Although the central processors of new mobile devices are becoming more and more powerful, they still cannot meet the delay requirements of DNN intelligent applications with extremely high real-time demands.

[0003] Edge computing is an effective way to solve the above problems. Mobile devices can hand over part or all of their computing tasks to edge cloud servers through computation offloading, overcoming the shortcomings of mobile devices in terms of resource s...

Claims


Application Information

IPC(8): H04L29/08, G06N3/04, G06N3/08
CPC: H04L67/10, G06N3/049, G06N3/08, G06N3/045
Inventors: 田贤忠, 朱娟, 许婷
Owner: ZHEJIANG UNIV OF TECH