
Sliding window sampling-based distributed machine learning training method and system thereof

A machine learning and sliding window technology, applied to machine learning, instruments, computing models, etc. It solves the problems that distributed asynchronous training has poor stability and convergence, that the contextual information about the staleness of a learner's gradients cannot be perceived, and that stale gradients are handled too simplistically, achieving the effects of alleviating poor training convergence, reducing training fluctuations, and improving robustness.

Inactive Publication Date: 2017-05-31
SHANGHAI ADVANCED RES INST CHINESE ACADEMY OF SCI
Cites: 0 | Cited by: 26

AI Technical Summary

Problems solved by technology

[0004] In view of the above shortcomings of the prior art, the purpose of the present invention is to provide a distributed machine learning training method and system based on sliding window sampling, to solve the problems in the prior art that the contextual information about the staleness of a learner's gradients cannot be perceived and that stale gradients are handled too simplistically, which lead to poor stability and convergence in distributed asynchronous training.



Examples


Embodiment 1

[0026] See figure 1, which is a flow chart of a distributed machine learning training method based on sliding window sampling in an embodiment of the present invention. The method of the present invention comprises the following steps:

[0027] S1, initialize the machine learning model parameters;

[0028] S2, each learner obtains a data slice of the full data set and performs model training independently;

[0029] S3, collect several rounds of historical gradient staleness samples, sample them with a sliding window, compute the gradient staleness context value, adjust the learning rate, and initiate a gradient update request;

[0030] S4, asynchronously collect multiple gradient staleness samples, update the global model parameters with the adjusted learning rate, and push the updated parameters;

[0031] S5, asynchronously obtain the pushed global parameter update and continue with the next round of training;

[0032] S6, check model convergence; if not converged, return to step S2 and loop; i...
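The sliding window logic of step S3 can be sketched as follows. The patent does not disclose the exact formula for the gradient staleness context value or for the learning-rate adjustment, so the window mean and the 1/(1 + context) damping below are illustrative assumptions, as are the names `SlidingStalenessWindow`, `record`, and `adjusted_lr`:

```python
from collections import deque

class SlidingStalenessWindow:
    """Keep the last `window` gradient-staleness samples and expose a
    context value (here: their mean) used to damp the learning rate."""

    def __init__(self, window=10, base_lr=0.1):
        self.samples = deque(maxlen=window)  # old samples slide out automatically
        self.base_lr = base_lr

    def record(self, staleness):
        # One staleness sample per completed training round.
        self.samples.append(staleness)

    def context(self):
        # Mean staleness over the window; 0 when there is no history yet.
        return sum(self.samples) / len(self.samples) if self.samples else 0.0

    def adjusted_lr(self):
        # Scale the learning rate down as average staleness grows,
        # so very stale gradients move the global model less.
        return self.base_lr / (1.0 + self.context())

w = SlidingStalenessWindow(window=3, base_lr=0.1)
for s in [0, 2, 4]:
    w.record(s)
print(w.context())      # → 2.0
print(w.adjusted_lr())  # 0.1 / (1 + 2.0) ≈ 0.0333
```

Using a bounded window rather than the full history is the point of the scheme: only recent staleness observations influence the learning rate, so a transient slowdown of one learner does not depress its learning rate forever.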

Embodiment 2

[0041] In addition, the present invention also provides a distributed machine learning training system based on sliding window sampling using the above method; please refer to figure 2. The system includes: a server node 1, which asynchronously collects several gradient update requests, updates and saves the global model parameters, and passively pushes the updated parameters to the clients; learner nodes 2, each of which obtains a data slice of the full data set and performs model training independently, uses the adjusted learning rate to initiate a gradient update to the server node 1 after each round of training, asynchronously obtains the updated parameters pushed by the server node 1, and initiates the next round of training; and a sliding sampling module (not shown) attached to each learner node 2, used to sample the gradient staleness samples of the previous rounds and calculate the gradien...
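A minimal single-process sketch of the server node / learner node interaction described above. The class and method names (`ServerNode`, `LearnerNode`, `push_gradient`, `pull`) and the use of a global update counter as the staleness clock are assumptions for illustration, not the patent's specification; a real deployment would run the nodes on separate machines communicating over RPC:

```python
import threading

class ServerNode:
    """Minimal asynchronous parameter server: each push applies one
    learner's scaled gradient under a lock; learners pull the latest
    parameters before their next round."""

    def __init__(self, dim):
        self.params = [0.0] * dim
        self.version = 0          # global update counter, used as a staleness clock
        self.lock = threading.Lock()

    def push_gradient(self, grad, lr):
        with self.lock:
            for i, g in enumerate(grad):
                self.params[i] -= lr * g
            self.version += 1
            return self.version

    def pull(self):
        with self.lock:
            return list(self.params), self.version

class LearnerNode:
    """Learner holding one data shard; staleness = number of global
    updates that landed since this learner last pulled."""

    def __init__(self, server):
        self.server = server
        self.local_params, self.seen_version = server.pull()

    def train_round(self, grad, lr):
        # In this single-threaded demo reading server.version directly is
        # safe; concurrent learners would read it via a locked accessor.
        staleness = self.server.version - self.seen_version
        self.server.push_gradient(grad, lr)
        self.local_params, self.seen_version = self.server.pull()
        return staleness

server = ServerNode(dim=2)
a, b = LearnerNode(server), LearnerNode(server)
print(a.train_round([1.0, 0.0], lr=0.1))  # 0: no updates since a pulled
print(b.train_round([0.0, 1.0], lr=0.1))  # 1: a's update landed after b pulled
print(server.params)                      # [-0.1, -0.1]
```

The staleness value returned by each round is exactly what the sliding sampling module would record and feed into the learning-rate adjustment of step S3.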


PUM

None

Abstract

The invention provides a sliding window sampling-based distributed machine learning training method and system. The method comprises the steps of: initializing the parameters of a machine learning model; obtaining a data slice of the full data set and independently carrying out model training; collecting multiple rounds of historical gradient staleness samples, sampling them with a sliding window, calculating a gradient staleness context value, adjusting the learning rate, and then initiating a gradient update request; asynchronously collecting the multiple gradient staleness samples, updating the global model parameters with the adjusted learning rate, and pushing the updated parameters; asynchronously obtaining the pushed global parameter update and carrying out the next round of training; and checking model convergence: if the model has not converged, repeating the training cycle; if it has converged, obtaining the model parameters. The learning rate of each learner is controlled using gradient staleness, improving the stability and convergence of distributed training, reducing the training fluctuations caused by the distributed system, and improving the robustness of distributed training.

Description

technical field

[0001] The present invention relates to large-scale distributed machine training, in particular to a distributed machine learning training method and system based on sliding window sampling.

background technique

[0002] Modern neural network architectures trained on large data sets can achieve impressive results across a wide variety of domains, ranging from speech and image recognition and natural language processing to industry-focused applications such as fraud detection and recommendation systems. However, training these neural network models has strict computational requirements. Although significant progress has been made in GPU hardware, network architectures, and training methods in recent years, the time required to train a network on a single machine remains unrealistically long. Fortunately, we are not limited to a single machine: a great deal of work and research has made efficient distributed training of neural...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Application (China)
IPC (8): G06N99/00
CPC: G06N20/00
Inventor: 田乔, 许春玲, 李明齐
Owner SHANGHAI ADVANCED RES INST CHINESE ACADEMY OF SCI