
Neural network optimization method for solving the lifted proximal operator machine (LPOM)

A neural network optimization technology, applied in neural learning methods, biological neural network models, and related areas. It addresses the difficulty of choosing an appropriate learning rate, avoids vanishing or exploding gradients, achieves high accuracy, and improves the training effect.

Active Publication Date: 2020-04-24
PEKING UNIV

AI Technical Summary

Problems solved by technology

It is very difficult to choose an appropriate learning rate when updating weights using gradient descent.
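To illustrate this sensitivity, here is a minimal sketch (not from the patent) of fixed-step gradient descent on the one-dimensional quadratic f(x) = x², whose gradient is 2x. The function and parameter names are illustrative assumptions.

```python
def gradient_descent(lr, steps=50, x0=5.0):
    """Minimize f(x) = x^2 (gradient 2x) from x0 with a fixed learning rate."""
    x = x0
    for _ in range(steps):
        x = x - lr * 2.0 * x  # gradient step: x <- x - lr * f'(x)
    return x

small = gradient_descent(lr=0.1)  # |x| shrinks by a factor 0.8 each step: converges
large = gradient_descent(lr=1.1)  # |x| grows by a factor 1.2 each step: diverges
```

Even on this trivial convex problem, a learning rate that is only slightly too large makes the iterates oscillate with growing magnitude instead of converging, which is the difficulty the LPOM approach seeks to sidestep.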

Method used




Detailed Description of the Embodiments

[0062] The present invention is further described below through embodiments in conjunction with the accompanying drawings, which do not limit the scope of the present invention in any way.

[0063] The present invention provides a neural network optimization method for solving the lifted proximal operator machine (LPOM). In training a feed-forward neural network, a new block coordinate descent method is used to solve the LPOM model; each sub-problem in the LPOM model is guaranteed to converge, and the variables can be updated in parallel, improving the accuracy of neural network training without occupying additional memory. The proposed neural network optimization method can be applied to specific tasks such as image recognition, speech recognition, and natural language processing.
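For intuition, the following is a toy sketch, not the patent's actual algorithm: the hidden-layer activations are lifted into a separate variable block X1, coupled to the weights by a quadratic penalty (a simplified stand-in for LPOM's proximal coupling terms), and the blocks are updated alternately by block coordinate descent. All variable names and the penalty weight `mu` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: fit Y ~ W2 @ relu(W1 @ X0), with the hidden activations X1
# treated as a free variable block tied to relu(W1 @ X0) by a penalty.
X0 = rng.standard_normal((4, 64))          # inputs, 64 samples
Y = rng.standard_normal((2, 64))           # targets
W1 = 0.5 * rng.standard_normal((8, 4))
W2 = 0.5 * rng.standard_normal((2, 8))
relu = lambda z: np.maximum(z, 0.0)
X1 = relu(W1 @ X0)                         # lifted activation variables
mu = 1.0                                   # penalty weight (assumed)

def objective():
    return (0.5 * np.sum((W2 @ X1 - Y) ** 2)
            + 0.5 * mu * np.sum((X1 - relu(W1 @ X0)) ** 2))

before = objective()
for _ in range(10):
    # Block 1: small gradient step on W1 through the penalty term.
    pre = W1 @ X0
    W1 -= 0.001 * mu * ((relu(pre) - X1) * (pre > 0)) @ X0.T
    # Block 2: exact minimization over X1 (a strongly convex quadratic);
    # every column of X1 is solved independently, hence in parallel.
    A = W2.T @ W2 + mu * np.eye(W2.shape[1])
    X1 = np.linalg.solve(A, W2.T @ Y + mu * relu(W1 @ X0))
    # Block 3: exact least-squares update of the output weights W2.
    W2 = np.linalg.lstsq(X1.T, Y.T, rcond=None)[0].T
after = objective()
```

Note how the X1 update solves each sample's column independently, which is what permits parallel activation updates; the actual LPOM sub-problems and their convergence guarantees are those stated in the patent's description, not this simplified penalty scheme.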

[0064] The following takes image recognition as an example to describe a specific implementation and to compare it with the current best results. The method of the present inve...



Abstract

The invention discloses a neural network optimization method for solving the lifted proximal operator machine (LPOM), and relates to the technical field of deep learning neural network optimization. In training a feed-forward neural network, a block coordinate descent method is adopted to solve the LPOM model; each sub-problem in the LPOM model is guaranteed to converge, the weights and activations of each layer of the neural network can be updated in parallel, and no extra memory is occupied. The method of the invention improves the parallelism, applicability, and training effect of neural network training while using relatively little storage.

Description

Technical field

[0001] The present invention relates to the technical field of deep learning neural network optimization, in particular to a method for optimizing a neural network by solving a lifted proximal operator machine (LPOM).

Background technique

[0002] A feed-forward deep neural network is composed of stacked fully-connected layers with no feedback connections. With recent advances in hardware and growth in dataset sizes, feed-forward deep neural networks have become the standard for many tasks, for example image recognition [16], speech recognition [12], and natural language understanding [6], as well as serving as an important component of Go-playing systems [22].

[0003] The objective of training a feed-forward neural network is a highly non-convex, nested function of the network weights. For decades, the main method for optimizing feed-forward neural networks has been stochastic gradient descent (SGD) [21]. Its effectiveness is verified by its success in ...
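As background context, a minimal SGD loop on a one-parameter least-squares problem can make the idea concrete: each step samples a single example and follows the gradient of that example's loss. This sketch is illustrative only; all names are assumptions, not the patent's notation.

```python
import random

# Fit w in y = w * x by SGD on per-example squared error (w*x - y)^2.
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]  # true w = 3
w, lr = 0.0, 0.1
random.seed(0)
for _ in range(200):
    x, y = random.choice(data)       # sample one example: the "stochastic" part
    grad = 2.0 * (w * x - y) * x     # gradient of (w*x - y)^2 w.r.t. w
    w -= lr * grad                   # gradient step with fixed learning rate
```

For deep networks the per-example loss is non-convex and nested through many layers, and the fixed learning rate `lr` becomes the delicate hyperparameter discussed above.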

Claims


Application Information

Patent Type & Authority: Application (China)
IPC: G06N3/08
CPC: G06N3/08; Y02T10/40
Inventors: 林宙辰, 李嘉, 方聪
Owner: PEKING UNIV