Task unloading method for deep learning application in edge computing environment

An edge computing and deep learning technology, applied in the field of edge computing and deep learning, which can solve problems such as ineffective use of terminal computing power, prolonged task execution time, and increased edge server load, and achieves the effect of low complexity.

Active Publication Date: 2019-10-18
SOUTHEAST UNIV

AI Technical Summary

Benefits of technology

This patented technology improves on existing approaches in which deep learning tasks are either executed entirely on the resource-limited terminal device or offloaded wholesale to an edge or cloud server, leaving terminal computing power underused and task execution times long. By splitting the deep neural network into cascadable model blocks and selecting an offloading scheme per application, it provides a multi-mode, fine-grained, personalized task offloading decision that minimizes task completion time and improves the resource utilization of terminal devices, so that deep learning applications with high-accuracy and low-latency requirements can be served.

Problems solved by technology

The technical problem addressed by this patent is that existing execution schemes for deep learning applications on mobile terminals either run the full model locally or offload it entirely to an edge or cloud server. Local execution makes poor use of the terminal's limited computing power and prolongs task execution time, while wholesale offloading increases the load on edge servers as well as communication overhead and response delays. As a result, the high-accuracy, low-latency requirements of deep learning applications cannot be met.


Embodiment Construction

[0058] The technical solutions and beneficial effects of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0059] The present invention provides a task offloading method for deep learning applications in an edge computing environment, which consists of four parts: building an edge computing execution framework, collecting system information, offloading modeling and analysis, and making the task offloading decision. A minimal illustrative sketch of how these four parts fit together is given immediately below; the specific implementation of each part then follows:
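The following is a minimal, hypothetical Python sketch of that pipeline: system information is collected, the two execution options (start on the terminal device versus offload directly to the edge server) are modeled as queues, and a simple grid search stands in for the heuristic decision step mentioned in the abstract. The field names, profiling numbers, and grid-search decision rule are illustrative assumptions, not the patent's actual algorithm.

```python
# Hypothetical sketch of the four parts named above: collect system information,
# model the two execution options, and make an offloading decision.
# All names and numbers here are illustrative assumptions.
from dataclasses import dataclass
from math import factorial

@dataclass
class SystemInfo:
    device_rate: float   # tasks/s the terminal device can serve
    edge_rate: float     # tasks/s one edge server can serve
    edge_servers: int    # number of parallel edge servers (the "n" in M/M/n)
    arrival_rate: float  # task arrival rate (tasks/s)
    uplink_delay: float  # seconds to ship a task's input to the edge

def expected_completion_time(arrival_rate, service_rate, servers):
    """Expected sojourn time E[T] in an M/M/n queue (standard Erlang C formula)."""
    rho = arrival_rate / (servers * service_rate)
    if rho >= 1.0:
        return float("inf")  # unstable queue: tasks pile up without bound
    a = servers * rho
    tail = a**servers / (factorial(servers) * (1 - rho))
    p_wait = tail / (sum(a**k / factorial(k) for k in range(servers)) + tail)
    return 1 / service_rate + p_wait / (servers * service_rate * (1 - rho))

def offloading_decision(info: SystemInfo, local_fraction_candidates=(0.0, 0.5, 1.0)):
    """Pick the split that minimizes the worse of the two expected completion times.

    local_fraction is the share of arriving tasks kept on the terminal device;
    the rest are offloaded to the edge server layer. The grid search stands in
    for the heuristic algorithm mentioned in the abstract.
    """
    best = None
    for f in local_fraction_candidates:
        t_dev = expected_completion_time(f * info.arrival_rate, info.device_rate, 1)
        t_edge = info.uplink_delay + expected_completion_time(
            (1 - f) * info.arrival_rate, info.edge_rate, info.edge_servers)
        worst = max(t_dev, t_edge)  # min-max objective over the two expectations
        if best is None or worst < best[1]:
            best = (f, worst)
    return best

info = SystemInfo(device_rate=2.0, edge_rate=10.0, edge_servers=4,
                  arrival_rate=1.5, uplink_delay=0.05)
print(offloading_decision(info))  # (chosen local fraction, estimated worst-case time)
```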

[0060] In the part of building the edge computing execution framework, as shown in Figure 1, the present invention draws on the idea of deep neural network branch networks to build an edge computing execution framework for deep learning applications, comprising three logical steps: model training, task offloading, and task execution. During model training, the deep neural network is split into three cascadable model blocks, and distributed in differe...
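As an illustration of the model-training step described above, the sketch below splits a small sequential network into three cascadable blocks whose outputs feed one another, so that each block could be deployed at a different location (for example, terminal device, edge server, and cloud). The example network, the split points, and the use of PyTorch are assumptions made for illustration only; the patent does not prescribe a particular network or framework.

```python
# Hypothetical sketch: splitting a sequential deep neural network into three
# cascadable model blocks, following the branch-network idea described above.
# The model, split indices, and block names are illustrative assumptions.
import torch
import torch.nn as nn

def split_into_blocks(layers, split_1, split_2):
    """Split an ordered list of layers into three cascadable blocks.

    block_a can run on the terminal device, block_b on the edge server,
    block_c in the cloud; each block's output is the next block's input.
    """
    block_a = nn.Sequential(*layers[:split_1])
    block_b = nn.Sequential(*layers[split_1:split_2])
    block_c = nn.Sequential(*layers[split_2:])
    return block_a, block_b, block_c

# Example model: a small convolutional network (purely illustrative).
layers = [
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 10),
]
device_block, edge_block, cloud_block = split_into_blocks(layers, 2, 4)

# Cascaded execution: the intermediate tensors are what would be transmitted
# between the terminal device, the edge server, and the cloud.
x = torch.randn(1, 3, 32, 32)
feat_device = device_block(x)        # executed locally on the terminal
feat_edge = edge_block(feat_device)  # offloaded to the edge server
logits = cloud_block(feat_edge)      # offloaded to the cloud
```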


Abstract

The invention discloses a task offloading method for deep learning applications in an edge computing environment. The method comprises the following steps: first, dividing a deep neural network into a plurality of model blocks, collecting related data in the system, and analyzing the corresponding features; then, taking the obtained characteristic data as input parameters, establishing an M/M/n queuing model and obtaining the expectation of the average number of tasks at the terminal device layer and at the edge server layer, as well as the expectation of the task completion time when a task starts execution on the terminal device and when it is offloaded directly to the edge server for execution; next, taking the minimum of the maximum of these two expectations as the objective function and constructing an optimization model that minimizes task execution time; and finally, solving the optimization model with a heuristic algorithm to obtain the optimal offloading scheme. The method can provide a multi-mode, fine-grained, personalized task offloading scheme for different deep learning applications, minimize task completion time, and improve the resource utilization of terminal devices, thereby meeting application requirements of high accuracy and low latency.
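For reference, the queuing quantities the abstract relies on can be written out explicitly. This is a minimal sketch assuming the standard M/M/n notation with arrival rate \(\lambda\), per-server service rate \(\mu\), \(n\) servers, and utilization \(\rho = \lambda/(n\mu) < 1\); the min-max objective at the end is one reading of the abstract's wording, not a verbatim reproduction of the claims.

```latex
% Standard M/M/n quantities (Erlang C), used here as an assumed reading of the
% abstract's "expectation of the average task number" and "expectation of task
% completion time" for a given layer.
\[
  \rho = \frac{\lambda}{n\mu}, \qquad
  P_{\mathrm{wait}} =
    \frac{\frac{(n\rho)^{n}}{n!\,(1-\rho)}}
         {\sum_{k=0}^{n-1}\frac{(n\rho)^{k}}{k!} + \frac{(n\rho)^{n}}{n!\,(1-\rho)}}
\]
\[
  \mathrm{E}[N] = n\rho + \frac{\rho\,P_{\mathrm{wait}}}{1-\rho}
  \qquad \text{(average number of tasks in the layer)}
\]
\[
  \mathrm{E}[T] = \frac{1}{\mu} + \frac{P_{\mathrm{wait}}}{n\mu\,(1-\rho)}
  \qquad \text{(expected task completion time at that layer)}
\]
% Assumed min-max objective over the two execution options named in the abstract:
\[
  \min_{\text{offloading scheme}}\;
  \max\!\bigl(\mathrm{E}[T_{\mathrm{device}}],\; \mathrm{E}[T_{\mathrm{edge}}]\bigr)
\]
```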


Application Information

Owner SOUTHEAST UNIV