
Neural network increment-type feedforward algorithm based on sample increment driving

A neural network and incremental technology, applied in the field of online learning single hidden layer feedforward neural network algorithms, which can solve the problems of poor prediction performance and stability of neural networks, with the effects of improving prediction accuracy and generalization ability, satisfying dynamic optimization control requirements, and improving stability.

Publication Date: 2016-04-06 | Status: Inactive
YANSHAN UNIV
Cites: 0 | Cited by: 6

AI Technical Summary

Problems solved by technology

[0003] At present, modeling of practical engineering application systems is based on time-series data, while the input weights and hidden-layer thresholds of most neural networks are randomly initialized. The generated model parameters therefore have no inherent relationship with the input sample data of the actual system, which leads to poor prediction performance and stability of the neural network.

Method used


Embodiment Construction

[0022] The present invention will be further described below in conjunction with the accompanying drawings.

[0023] As shown in Figure 1, the algorithm steps of the present invention are as follows:

[0024] Step 1. In the neural network, select L0 training samples from any time to initialize the model parameters; randomly set an m×n matrix P, where m is the number of hidden-layer nodes and n is the number of input nodes, and compute the composite matrix.
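A minimal NumPy sketch of Step 1 under stated assumptions: the excerpt does not give the composite-matrix formula, so the construction of `C0` below (projecting the L0 samples through the random matrix P) is a hypothetical placeholder, and the sample data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# L0 training samples selected from any time window:
# X0 holds the inputs (L0 x n), T0 the corresponding targets (L0 x 1).
L0, n, m = 50, 4, 20                      # samples, input nodes, hidden-layer nodes
X0 = rng.uniform(-1.0, 1.0, size=(L0, n))
T0 = rng.uniform(-1.0, 1.0, size=(L0, 1))

# Randomly set an m x n matrix P (m hidden nodes, n input nodes).
P = rng.uniform(-1.0, 1.0, size=(m, n))

# Hypothetical composite matrix: the samples projected through P
# (an assumption -- the patent's exact construction is not in this excerpt).
C0 = X0 @ P.T                             # shape L0 x m
```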

[0025] Step 2. Calculate the input weights and hidden-layer thresholds;

[0026] Step 3. Calculate the hidden-layer output matrix H0;

[0027] Step 4. Calculate the output weight matrix β0 according to the least-squares method and the Moore-Penrose (MP) generalized inverse; set the parameter k = 0, where k is the sequence number of added samples.
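Continuing the sketch above, Steps 2 to 4 might look as follows. The least-squares relation used here to obtain the input weights W and thresholds b from the composite matrix is an illustrative assumption, as is the sigmoid activation; the hidden-layer output H0 and the output weights β0 then follow the standard single-hidden-layer form, with β0 computed through the MP pseudoinverse as Step 4 describes.

```python
# Step 2 (illustrative): fit W and b by least squares so that the samples,
# augmented with a bias column, reproduce the composite matrix C0.
A = np.hstack([X0, np.ones((L0, 1))])     # L0 x (n + 1)
W_aug, *_ = np.linalg.lstsq(A, C0, rcond=None)
W = W_aug[:-1, :].T                       # m x n input weights
b = W_aug[-1, :]                          # m hidden-layer thresholds

# Step 3: hidden-layer output matrix H0 (L0 x m), sigmoid activation assumed.
def hidden_output(X, W, b):
    return 1.0 / (1.0 + np.exp(-(X @ W.T + b)))

H0 = hidden_output(X0, W, b)

# Step 4: output weight matrix beta0 via least squares / MP generalized inverse.
beta0 = np.linalg.pinv(H0) @ T0
k = 0                                     # sequence number of added samples
```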

[0028] Step 5. Introduce a new sample X1, calculate the difference between the new sample and the L0-th initial training sample, and determine whether there is an increment between the samples; if there is...
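The excerpt truncates Step 5, so the continuation below only shows an increment check between the new sample X1 and the last of the L0 initial samples, followed by a plain refit of the output weights as a stand-in for the patent's actual update of the input weights, thresholds, and β; the tolerance `eps` and the refit are assumptions.

```python
eps = 1e-6                                # assumed tolerance for "an increment exists"

def incremental_step(X_new, t_new, X_old, T_old, W, b):
    """Check the increment against the last retained sample and, if present,
    refit the output weights (placeholder for the patent's update rule)."""
    increment = X_new - X_old[-1]
    if np.linalg.norm(increment) > eps:   # an increment exists between the samples
        X_upd = np.vstack([X_old, X_new])
        T_upd = np.vstack([T_old, t_new])
        H = hidden_output(X_upd, W, b)
        beta = np.linalg.pinv(H) @ T_upd
        return X_upd, T_upd, beta
    return X_old, T_old, None             # no increment: keep the current model

# Introduce a new (synthetic) sample X1 with target t1 and apply the check.
X1 = rng.uniform(-1.0, 1.0, size=(1, n))
t1 = rng.uniform(-1.0, 1.0, size=(1, 1))
X0, T0, beta1 = incremental_step(X1, t1, X0, T0, W, b)
```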


Abstract

The invention provides a neural network increment-type feedforward algorithm based on sample increment driving. According to the characteristics of the input samples, the input weights and hidden-layer thresholds of a neural network are obtained by the least-squares method, completing the initialization of the model parameters; the input weights and hidden-layer thresholds of the model are then updated according to the increments between new samples and old samples, functional relations between the model parameters and the input samples are established, and sample self-adaptability and online feedforward adjustment are realized. The algorithm has the advantages of high prediction precision, strong generalization capability, and availability of online feedforward adjustment.

Description

Technical Field

[0001] The invention relates to the field of online learning neural networks, and in particular to an algorithm for an online learning single hidden layer feedforward neural network.

Background Technique

[0002] Since the 1980s, neural networks have developed a variety of model structures and learning rules, such as the earliest BP neural network based on error backpropagation learning, the radial basis function neural network, discrete and continuous Hopfield neural networks, the SOM neural network, and so on. The various neural networks described above can be used for both classification and regression problems and have all achieved good modeling results. The extreme learning machine algorithm proposed in recent years has become a research hotspot. It is a single hidden layer feedforward neural network, which has good nonlinear identification ability, a simple structure, few adjustable parameters, and learning...
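For context, a minimal sketch of the conventional extreme learning machine mentioned above, in which the input weights and hidden-layer thresholds are purely random and only the output weights are solved analytically; this is the baseline the invention modifies, not the patent's own method.

```python
import numpy as np

def elm_train(X, T, m, rng=np.random.default_rng(0)):
    """Standard ELM: random input weights/thresholds, analytic output weights."""
    n = X.shape[1]
    W = rng.uniform(-1.0, 1.0, size=(m, n))   # random input weights
    b = rng.uniform(-1.0, 1.0, size=m)        # random hidden-layer thresholds
    H = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))  # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T              # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))
    return H @ beta
```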


Application Information

IPC(8): G06N3/04
CPC: G06N3/044
Inventors: 牛培峰, 马云鹏, 李国强, 武怀勤, 李霞
Owner: YANSHAN UNIV