Deep neural network optimizing method based on coevolution and back propagation

A deep neural network and back-propagation technology, applied to neural learning methods, biological neural network models, and the like. It addresses the problem that training easily falls into local optimal solutions, and achieves improved classification accuracy, faster optimization, and good optimization performance.

Inactive Publication Date: 2017-05-10
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0004] One of the current optimization algorithms for deep neural networks is the gradient-based backpropagation algorithm, which tends to fall into a local optimal solution as the complexity of the network increases.



Embodiment Construction

[0054] The method for optimizing a deep neural network based on co-evolution and backpropagation provided by the present invention will be described in detail below in conjunction with the accompanying drawings and embodiments.

[0055] The present invention proposes a deep neural network optimization method based on co-evolution and backpropagation, comprising the following steps:

[0056] Step 101: start the deep neural network optimization method based on co-evolution and backpropagation;

[0057] Step 102: Set the deep neural network structure, where L_i denotes the i-th layer of the network and N_i the number of nodes in the i-th layer; initialize the weights W and biases b, set the learning rate η, and set the custom parameter H;

[0058] Step 103: Input training samples into the deep neural network of step 102, and then train it with the backpropagation algorithm until the iterative error change value σ between two consecutive iterations satisfies the stopping condition;
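As a concrete illustration of steps 102 and 103, the following Python sketch is hypothetical rather than taken from the patent: the layer sizes, the sigmoid activation, and the mean-squared-error loss are assumptions, and the stopping test is assumed to compare the error change value σ with the custom parameter H, which is a natural reading of the truncated step 103 but is not stated explicitly on this page.

```python
import numpy as np

# Step 102 (assumed concrete values): layer sizes N_i for a four-layer network,
# random initial weights W and zero biases b, learning rate eta, custom parameter H.
layer_sizes = [784, 256, 64, 10]
rng = np.random.default_rng(0)
W = [rng.normal(0.0, 0.01, size=(n_in, n_out))
     for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
b = [np.zeros(n_out) for n_out in layer_sizes[1:]]
eta = 0.1    # learning rate
H = 1e-4     # custom parameter H used in the stopping test below

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, W, b):
    """Forward pass; returns the activations of every layer."""
    activations = [x]
    for Wi, bi in zip(W, b):
        activations.append(sigmoid(activations[-1] @ Wi + bi))
    return activations

def backprop_until_stable(X, Y, W, b, eta, H, max_iters=1000):
    """Step 103 (assumed form): plain backpropagation until the error change
    between two consecutive iterations drops below H."""
    prev_err = np.inf
    for _ in range(max_iters):
        acts = forward(X, W, b)
        err = np.mean((acts[-1] - Y) ** 2)
        if abs(prev_err - err) < H:          # iterative error change value sigma
            break
        prev_err = err
        # Output-layer delta for mean squared error with sigmoid units.
        delta = (acts[-1] - Y) * acts[-1] * (1.0 - acts[-1])
        for i in reversed(range(len(W))):
            grad_W = acts[i].T @ delta / len(X)
            grad_b = delta.mean(axis=0)
            if i > 0:
                delta = (delta @ W[i].T) * acts[i] * (1.0 - acts[i])
            W[i] -= eta * grad_W
            b[i] -= eta * grad_b
    return W, b
```

With training inputs X (one sample per row) and targets Y, a call such as W, b = backprop_until_stable(X, Y, W, b, eta, H) carries the network to the point where, according to the Abstract, the coevolution stage takes over.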



Abstract

The invention discloses a deep neural network optimizing method based on coevolution and back propagation, belonging to the technical field that combines deep learning with evolutionary computation, and mainly aims to solve the problem that training of a deep neural network tends to fall into a local optimal solution. The method comprises the following steps: (1) first optimize the network with the back propagation algorithm; (2) when a stopping condition is satisfied, optimize with a coevolution algorithm; (3) repeat the previous steps until an iteration stopping condition is satisfied; and (4) take the weights and biases obtained from the final optimization as the optimal parameters. The method applies the advantages of evolutionary algorithms to the training of deep neural networks and uses coevolution to optimize the large-scale parameters. A selection strategy designed in conjunction with the back propagation algorithm increases the optimization speed of the coevolution, so the whole network can be trained more efficiently, good optimization performance is achieved, and the classification accuracy of the network is increased.
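Step (2) above relies on cooperative coevolution: the large parameter vector is split into groups, and each group is evolved by its own subpopulation while the remaining parameters are held at their best-known values. The sketch below is a simplified, hypothetical Python illustration of that idea; the grouping, the Gaussian mutation, and the elitist refill are placeholder choices, and the patent's own selection strategy, which additionally uses back propagation to speed up the coevolution, is only indicated in a comment.

```python
import numpy as np

def coevolve(best, fitness, n_groups=4, pop_size=10, generations=5, seed=0):
    """Cooperative coevolution over a flattened parameter vector `best`.
    Each group of parameters is evolved by its own subpopulation while the
    other parameters stay fixed at their best-known values (the context
    vector). A simplified placeholder for the patent's coevolution stage."""
    rng = np.random.default_rng(seed)
    best = best.copy()
    for idx in np.array_split(np.arange(best.size), n_groups):
        # Subpopulation: Gaussian perturbations of the current subcomponent.
        pop = best[idx] + rng.normal(0.0, 0.05, size=(pop_size, idx.size))
        for _ in range(generations):
            trials = np.tile(best, (pop_size, 1))
            trials[:, idx] = pop
            scores = np.array([fitness(t) for t in trials])  # lower = better (loss)
            elite = pop[np.argsort(scores)[: pop_size // 2]]
            # Refill by perturbing the elite; the patent's selection strategy
            # additionally consults back propagation to accelerate this step.
            pop = np.vstack([elite, elite + rng.normal(0.0, 0.05, size=elite.shape)])
        # Keep the best subcomponent found for this group.
        trials = np.tile(best, (pop_size, 1))
        trials[:, idx] = pop
        scores = np.array([fitness(t) for t in trials])
        best[idx] = pop[np.argmin(scores)]
    return best
```

An outer loop in the spirit of steps (1)-(3) would then alternate: run back propagation until its stopping condition is met, flatten the weights and biases into a single vector, call a routine like coevolve with the training loss as the fitness function, write the improved vector back into the network, and repeat until the overall iteration stopping condition holds; the weights and biases from the final optimization are the optimal parameters of step (4).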

Description

Technical field

[0001] The invention belongs to the field that combines deep learning with evolutionary algorithms. It mainly addresses the parameter optimization problem of deep neural networks and specifically provides a deep neural network optimization method based on coevolution and back propagation, realizing the optimization of deep neural network parameters.

Background technique

[0002] Since the 1980s, the neural network (Neural Network, NN) has entered a fast lane of development. Major breakthroughs in new scientific theories and the rapid development of high-performance computers have brought NN back to life. In 1982, Professor Hopfield of the California Institute of Technology proposed the famous Hopfield neural network model, which strongly promoted research on NN. In the mid-1980s, Ackley, Hinton and Sejnowski introduced a random mechanism based on the idea of simulated annealing into the Hopfield NN model, and based on this model analyzed the differen...


Application Information

IPC(8): G06N3/08
CPC: G06N3/084
Inventors: 公茂果, 马晶晶, 赵昆, 刘嘉, 李豪, 张普照, 王善峰, 武越
Owner: XIDIAN UNIV