
Distributed stochastic gradient descent method

A distributed stochastic gradient descent technology in the field of machine learning. It addresses the problems of poor model parameters being collected and long waiting times for recycling parameters, with the effect of shortening overall training time, accelerating convergence, and making the training process more efficient.

Active Publication Date: 2020-03-27
TONGJI UNIV
Cites: 6 · Cited by: 11
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0011] The purpose of the present invention is to provide a distributed stochastic gradient descent method that overcomes the disadvantages of the above-mentioned prior art in heterogeneous networks, namely collecting poor model parameters and spending substantial waiting time recycling and releasing parameters.


Image

[Three figures: Distributed stochastic gradient descent method]

Examples

Experimental program
Comparison scheme
Effect test

Embodiment

[0054] This embodiment provides a distributed stochastic gradient descent method, as shown in Figure 1, comprising the following steps:

[0055] Step S1: The parameter server obtains the initial global gradient;

[0056] Step S2: Based on the initial global gradient and its initial task assignment strategy, each working node calculates its local gradient;

[0057] Step S3: The parameter server collects the working-node gradients and computes an updated global gradient;

[0058] Step S4: Based on the updated global gradient and blockchain technology, the parameter server obtains the optimal gradient and the updated task allocation strategy of the working nodes;

[0059] Step S5: The optimal gradient is saved in the parameter cache of the working node;

[0060] Step S6: Replace the initial task allocation strategy with the updated task allocation strategy and the initial global gradient with the optimal gradient, then repeat steps S2 through S6 until the weights converge.
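At a high level, steps S1 through S6 describe a synchronous parameter-server training round. The sketch below is a minimal single-process illustration of that loop, not the patent's actual method: all names (`worker_gradient`, `server_update`, the data shards) are hypothetical, and the blockchain-based selection of an "optimal gradient" in step S4 is stood in by a plain average of worker gradients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic noiseless linear-regression data, split across three workers
# (the shards play the role of the "task allocation strategy").
true_w = np.array([2.0, -1.0])
X = rng.normal(size=(120, 2))
y = X @ true_w

def worker_gradient(w, Xi, yi):
    """Step S2: a worker computes its local gradient on its assigned shard."""
    err = Xi @ w - yi
    return Xi.T @ err / len(yi)

def server_update(w, grads, lr=0.1):
    """Steps S3-S4: the server aggregates worker gradients into a global
    update. The patent selects an 'optimal gradient' via blockchain; here a
    plain average stands in for that selection."""
    g = np.mean(grads, axis=0)
    return w - lr * g

w = np.zeros(2)                                # S1: initial global parameters
shards = np.array_split(np.arange(len(y)), 3)  # initial task allocation

for step in range(200):                        # S6: repeat S2-S6 until converged
    grads = [worker_gradient(w, X[idx], y[idx]) for idx in shards]  # S2
    w = server_update(w, grads)                # S3-S4
    # S5: in the patent the chosen gradient is cached at each worker;
    # in this single-process sketch, `w` itself plays that role.

print(w)  # approaches true_w = [2, -1]
```

Because the shards are equal-sized and the data is noiseless, the averaged worker gradient equals the full-batch gradient, so the loop converges to the true weights; in the patent's heterogeneous-network setting, the point of the blockchain-based selection in S4 is precisely to avoid waiting on (or averaging in) stale or poor worker gradients.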


PUM

No PUM available.

Abstract

The invention relates to a distributed stochastic gradient descent method comprising the following steps: S1, a parameter server obtains an initial global gradient; S2, based on the initial global gradient and an initial task allocation strategy, each working node calculates its local gradient; S3, the parameter server obtains the working-node gradients and calculates an updated global gradient; S4, the parameter server obtains an optimal gradient and an updated task allocation strategy for the working nodes based on the updated global gradient and blockchain technology; S5, the optimal gradient is stored in the parameter cache of each working node; S6, the updated task allocation strategy replaces the initial task allocation strategy, the optimal gradient replaces the initial global gradient, and steps S2-S6 are repeated until the weights converge. Compared with the prior art, the method avoids collecting poor model parameters, accelerates model convergence, and shortens overall training time.

Description

Technical field

[0001] The invention relates to the field of machine learning, and in particular to a distributed stochastic gradient descent method.

Background technique

[0002] Today, the great advantages of artificial intelligence technology have been realized in many fields. Machine learning is an indispensable part of artificial intelligence: it helps people make judgments and decisions by abstracting and modeling massive data. At the same time, blockchain 3.0 has been implemented in practice; its scope has moved beyond digital currency and smart contracts, and it can serve as an important supporting technology for the interaction of massive data.

[0003] The rapid growth of massive data is accompanied by a sharp demand for more complex models (possibly with billions of parameters) to support higher accuracy on orders-of-magnitude-larger data and to solve other intelligent tasks (such as unmanned driving and image processing)...

Claims


Application Information

Patent Timeline
No application data available.
Patent Type & Authority: Application (China)
IPC(8): G06N20/00
CPC: G06N20/00
Inventors: 杨恺, 张春炯, 王钰皓
Owner: TONGJI UNIV