
Neural network asynchronous training-oriented learning rate adjustment method

A learning rate adjustment technology for asynchronous training of neural networks. It addresses the problems of reduced network accuracy, hyperparameters that are difficult to define, and numerical settings that lack an exact theoretical basis, achieving a more balanced learning rate and faster network convergence.

Pending Publication Date: 2021-05-28
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

[0010] 1. The numerical settings have no exact theoretical basis and are only set heuristically; the hyperparameters are likewise difficult to define and can only be chosen through experiments.
[0011] 2. This type of method ignores other influencing factors, such as the minibatch size and the delays of the other gradients in the current minibatch.
[0013] Because of these problems, these simple learning rate adjustment methods give good results only when the number of workers is small, or when the minibatch computed by each worker per round is extremely small. Once these two conditions are not fully met, the accuracy of the trained network drops sharply. The experiments explain this phenomenon in detail and show that the larger the value of (number of workers N × batch size of each worker), the worse the effect of asynchronous updating.
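For context, below is a minimal sketch of the kind of simple, heuristic adjustment being criticized here, assuming the common rule of scaling the base learning rate only by a gradient's staleness (delay). The function name, the 1/(1+staleness) rule, and the numbers are illustrative, not taken from the patent.

```python
def naive_delayed_lr(base_lr: float, staleness: int) -> float:
    """Heuristic staleness-only learning rate: scale the base rate down by how
    many global rounds have passed since the gradient's parameters were sent.
    It ignores the minibatch size and the delays of the other gradients arriving
    in the same round, which is exactly the criticism above."""
    return base_lr / (1 + staleness)

# Example: a gradient computed 4 rounds ago gets one fifth of the base rate.
print(naive_delayed_lr(0.1, staleness=4))  # 0.02
```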




Embodiment Construction

[0062] The present invention is further described below in conjunction with the accompanying drawings. It should be noted that this embodiment is based on the technical solution and provides a detailed implementation and specific operation process, but the protection scope of the present invention is not limited to this embodiment.

[0063] As shown in Figure 1, the present invention is a learning rate adjustment method for asynchronous training of neural networks, the method comprising the following steps:

[0064] S1: initialize the parameters;

[0065] S2: send the parameters of the neural network to all idle computing nodes: for every node that completed its calculation in the previous cycle and submitted its results, the parameter server sends the updated parameters and lets it start the next round of calculation; after this step, the whole system enters the next round of calculation, with the current round t_glob = ...
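A minimal sketch of steps S1 and S2 under an assumed parameter-server setup; the class names (ParameterServer, Worker) and the round stamp attached to the sent parameters are illustrative details, not taken from the patent text.

```python
import copy

class Worker:
    """Stub compute node: just records the parameters and the round stamp."""
    def receive(self, params, sent_at_round):
        self.params, self.sent_at_round = params, sent_at_round

class ParameterServer:
    def __init__(self, init_params):
        # S1: initialize the model parameters and the global round counter.
        self.params = init_params
        self.t_glob = 0

    def dispatch_to_idle_workers(self, idle_workers):
        # S2: every node that finished and submitted its result in the previous
        # cycle is sent the updated parameters, stamped with the round in which
        # they were sent (the stamp can later be used to measure each gradient's delay).
        self.t_glob += 1
        for w in idle_workers:
            w.receive(copy.deepcopy(self.params), sent_at_round=self.t_glob)

# S1 then S2 for a toy two-parameter model and two idle workers.
server = ParameterServer(init_params=[0.0, 0.0])
server.dispatch_to_idle_workers([Worker(), Worker()])
```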



Abstract

The invention discloses a learning rate adjustment method for asynchronous training of a neural network. The method comprises the following steps: initializing parameters; sending the parameters of the neural network to all idle computing nodes until c calculation results are received; adjusting the learning rates of the c received gradients respectively; using the adjusted learning rates and the received c gradients to perform a one-step gradient descent update on the network; and judging whether the network precision meets the requirement. If the requirement is met, training is completed, response 2 is sent to all computing nodes, and the process exits; otherwise, the method returns to the parameter-sending step, sends response 1 to all nodes that finished calculating in the current round, and carries out the next round of training. The beneficial effects of the method are that the learning rate of a delayed gradient does not rise linearly with the increase in number, while the delays of the other currently received gradients and the sample batch size are taken into account in the calculation, so that the overall learning rate is adjusted in a more balanced and more principled way.
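Read as a server-side loop, one round of the abstract looks roughly like the sketch below. The actual adjustment formula is not reproduced in this excerpt, so adjust_learning_rate is only a stand-in that shows the inputs the abstract says are considered (each gradient's own delay, the delays of the other received gradients, and the sample batch size); all names and the toy numbers are illustrative.

```python
def adjust_learning_rate(base_lr, delay, all_delays, batch_size):
    # Stand-in only, NOT the patented formula: the patent's rule also weighs
    # the other gradients' delays and the batch size.
    return base_lr / (1 + delay)

def one_round(params, received, base_lr, batch_size):
    """One server-side round per the abstract: adjust a learning rate for each
    of the c received gradients, then apply a one-step gradient-descent update.
    `received` is a list of (gradient_vector, delay_in_rounds) pairs."""
    delays = [d for _, d in received]
    for grad, delay in received:
        lr = adjust_learning_rate(base_lr, delay, delays, batch_size)
        params = [p - lr * g for p, g in zip(params, grad)]
    return params

# Toy usage: c = 3 gradients with different delays on a 2-parameter model.
params = [1.0, -2.0]
received = [([0.5, -0.1], 0), ([0.4, -0.2], 2), ([0.6, 0.0], 5)]
params = one_round(params, received, base_lr=0.1, batch_size=32)
print(params)
```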

Description

Technical Field
[0001] The invention belongs to the technical field of artificial intelligence and neural network optimization research, and in particular relates to a learning rate adjustment method for asynchronous training of neural networks.
[0002] Technical Background
[0003] As data sets expand day by day, the parameters of trained models (such as deep neural networks) also keep increasing, and stochastic gradient descent (SGD) optimization has become the core of current supervised learning algorithms.
[0004] This training method consists of several rounds of optimization. In each cycle, several samples are randomly selected from the training set and passed through the neural network, and the loss (LOSS) is calculated from the difference between the neural network's outputs and the actual results; the network is then back-propagated according to this loss, the gradient of each parameter with respect to the loss is calculated, and ...
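For reference, one cycle of the plain minibatch SGD described above (not the patent's asynchronous variant), sketched with PyTorch as an assumed framework and a toy linear model; all sizes are arbitrary.

```python
import torch
from torch import nn

model = nn.Linear(10, 2)                       # tiny stand-in network
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# One optimization cycle: sample a minibatch, forward pass, compute the loss,
# back-propagate to get each parameter's gradient, then update the parameters.
x = torch.randn(32, 10)                        # randomly selected samples
y = torch.randint(0, 2, (32,))                 # their actual labels
loss = loss_fn(model(x), y)                    # difference between outputs and truth
opt.zero_grad()
loss.backward()                                # gradients of the loss w.r.t. parameters
opt.step()                                     # gradient descent update
```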

Claims


Application Information

IPC (8): G06K 9/62, G06F 17/16, G06N 3/04, G06N 3/08
CPC: G06F 17/16, G06N 3/084, G06N 3/045, G06F 18/214, Y02D 10/00
Inventors: 李尹健, 卢宇彤
Owner: SUN YAT SEN UNIV