
Method for constructing and training dynamic neural network of incomplete recursive support

A neural-network technology applied in the field of incomplete recursive support dynamic neural networks and their learning algorithms; it addresses problems such as time-consuming training, labor-intensive trial-and-error design, and the inability to fundamentally overcome the shortcomings of the BP algorithm.

Inactive Publication Date: 2016-03-23
CHANGAN UNIV
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Since the BP algorithm itself is slow, designing the network by trial and error only makes matters worse: it consumes a great deal of training time, so these methods are time-consuming, labor-intensive, and prone to failure.
Because the BP algorithm optimizes the error function by gradient descent, it inevitably suffers from local minima, convergence that is sensitive to the initial values, and slow convergence.
Since the BP algorithm began to circulate in 1986, many articles on improving it have been published. A large number of scholars have carried out extensive research on its shortcomings, for example by varying the learning rate or adding a momentum (inertia) term. Although these measures bring some improvement, they still cannot fundamentally overcome the essential shortcomings of the BP algorithm.
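The local-minimum and initial-value problems above come from plain gradient descent, the optimizer underlying BP. A minimal illustration (the objective function here is an arbitrary example, not from the patent): on a function with two local minima, which minimum gradient descent reaches depends entirely on the starting point.

```python
def gradient_descent(df, x0, lr=0.01, steps=2000):
    """Plain gradient descent on a 1-D function, given its derivative df."""
    x = x0
    for _ in range(steps):
        x -= lr * df(x)
    return x

# f(x) = x**4 - 3*x**2 + x has two distinct local minima; gradient
# descent converges to whichever one the initial value lies nearest,
# illustrating the initial-value sensitivity noted above.
df = lambda x: 4 * x**3 - 6 * x + 1

left = gradient_descent(df, x0=-2.0)   # settles near x ≈ -1.30
right = gradient_descent(df, x0=2.0)   # settles near x ≈ 1.13
```

Started from a different side of the central hump, the same algorithm with the same step size lands in a different minimum; no amount of learning-rate tuning changes that behavior, which is why the section calls the shortcoming essential.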




Embodiment Construction

[0038] In essence, neural networks fall into two categories: feed-forward networks without feedback and recurrent networks with feedback. Having no feedback, the multi-layer feed-forward network has a simple structure and mature learning algorithms; but it is essentially a static structure that cannot describe dynamic systems well, which limits its use in neural network control. The feedback in a recurrent network gives it a strong ability to express dynamic systems, with characteristics such as attractor dynamics and storage of historical information, so it has gained great application advantages in dynamic systems. Unfortunately, because the structure and dynamics of a fully connected dynamic recurrent neural network are complex, it has too many bidirectional weight parameters, and the feedback recursion makes the learning process complicated and lengthy, which in turn restricts its wide application. Based on ...
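The trade-off described above can be sketched with a partially (locally) recurrent network in the Elman style: only the hidden layer feeds back to itself, so the count of recurrent weights stays far below that of a fully connected recurrent net while the hidden state still carries historical information. This is a generic illustration of partial recurrence, not the patent's specific architecture; all sizes and names are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 5, 2

W_in = rng.standard_normal((n_hid, n_in)) * 0.1    # input -> hidden
W_rec = rng.standard_normal((n_hid, n_hid)) * 0.1  # hidden -> hidden (the only feedback)
W_out = rng.standard_normal((n_out, n_hid)) * 0.1  # hidden -> output

def step(x, h_prev):
    """One time step: the hidden state mixes the new input with its own past."""
    h = np.tanh(W_in @ x + W_rec @ h_prev)
    y = W_out @ h            # linear (identity) output layer
    return y, h

h = np.zeros(n_hid)          # zero initial state
for x in rng.standard_normal((4, n_in)):   # a short input sequence
    y, h = step(x, h)
```

A fully connected recurrent net would need feedback weights between every pair of units across layers; restricting feedback to the single hidden-to-hidden matrix `W_rec` is what keeps training tractable.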



Abstract

The invention discloses a method for constructing and training an incomplete recursive support dynamic neural network. Taking a local recurrent neural network as the framework, the construction method sets the transfer and activation functions of the input and output layers of the local recurrent neural network, defines a support-like function to serve as the transfer function of the network's hidden layer, and provides a training method for the improved network. Because of this structural change, the dynamic neural network with partial feedback provided by the invention better reflects the properties of dynamic systems and has strong function-approximation ability; the method can accurately determine the neural network structure from the training sample data, complete training quickly, and satisfy the required error precision.
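The abstract names a "support-like function" as the hidden-layer transfer function but does not give its form here. Purely as a hedged illustration of what a transfer function with local (compact) support looks like, and not the patent's actual definition, here is a generic bump function that is nonzero only inside a finite interval:

```python
import numpy as np

def bump(r, width=1.0):
    """(1 - (r/width)**2)**2 inside |r| < width, exactly 0 outside.
    A generic compactly supported transfer function; illustrative only."""
    u = np.clip(np.abs(r) / width, 0.0, 1.0)
    return (1.0 - u**2) ** 2

r = np.linspace(-2.0, 2.0, 5)   # [-2, -1, 0, 1, 2]
# bump(r) -> [0, 0, 1, 0, 0]: each hidden unit responds only inside
# its own local support, unlike a sigmoid, which is nonzero everywhere
```

Local support is what lets the network structure be read off from the training samples: each hidden unit covers only a bounded region of the input, so the number of units needed follows from how the samples are distributed.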

Description

technical field [0001] The invention relates to the field of computer artificial intelligence, in particular to an incomplete recursive support dynamic neural network and a learning algorithm thereof. Background technique [0002] In 1943, the neurophysiologist McCulloch and the young mathematician Pitts proposed the first artificial neuron model and, on this basis, abstracted the mathematical model of the neuron (usually called the MP model), opening the prologue of neural network research. Since then, various neural network models have been proposed one after another. In 1986, Rumelhart and his research team proposed the BP neural network and the BP learning algorithm, which remains the most popular and most influential artificial neural network. [0003] The BP neural network is constructed as follows: [0004] 1) The first layer of the network is the input layer, and its transfer or activation function is the identity transformation r(n)=n; [0005] 2) After that, there can be...
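The classic BP construction the background begins to outline, identity transfer on the input layer, sigmoid hidden units, and gradient-descent weight updates, can be sketched as follows. The network size, learning rate, and XOR task are illustrative choices for a self-contained demo, not details from the patent.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1 = rng.standard_normal((2, 4)); b1 = np.zeros(4)   # input -> hidden
W2 = rng.standard_normal((4, 1)); b2 = np.zeros(1)   # hidden -> output
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

def forward(X):
    H = sig(X @ W1 + b1)       # input layer is the identity r(n) = n
    return H, sig(H @ W2 + b2)

def sse():
    """Sum of squared errors, the loss BP descends."""
    return float(((forward(X)[1] - T) ** 2).sum())

loss_before = sse()
lr = 0.5
for _ in range(5000):                      # BP: gradient descent on the error
    H, Y = forward(X)
    dY = (Y - T) * Y * (1 - Y)             # error gradient through output sigmoid
    dH = (dY @ W2.T) * H * (1 - H)         # back-propagated to the hidden layer
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(0)
loss_after = sse()
```

Running this shows both faces of BP described earlier: the loss falls, but only after thousands of iterations, and a different seed can leave the network in a poor local minimum.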

Claims


Application Information

IPC (8): G06N3/08
CPC: G06N3/08
Inventor 吕进, 吕永红, 孙广成, 吕若琳
Owner CHANGAN UNIV