
Deep neural network training method based on multiple pre-training

A deep neural network technology, applied in the field of deep neural network training based on multiple pre-training, addressing problems such as the initial weight values having only a single characteristic.

Active Publication Date: 2016-12-07
THE 28TH RES INST OF CHINA ELECTRONICS TECH GROUP CORP

AI Technical Summary

Problems solved by technology

[0003] However, some problems require the initial values of the deep neural network's weights to be composite, that is, the initial weight values must have multiple characteristics rather than just a single one.



Examples


Embodiment

[0108] The following uses a three-hidden-layer neural network as an example to introduce the application of the present invention in a handwritten character recognition system. The network structure is shown in Figure 4.
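
Below is a minimal sketch of such a three-hidden-layer network in Python/NumPy. The 24 × 24 input size follows Step B; the hidden-layer widths (500) and the 10-class output are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

# Assumed layer widths: 576 inputs (24x24 pixels), three hidden layers, 10 output classes.
layer_sizes = [24 * 24, 500, 500, 500, 10]

rng = np.random.default_rng(0)
weights = [rng.normal(0.0, 0.01, size=(n_in, n_out))
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n_out) for n_out in layer_sizes[1:]]

def forward(x):
    """Forward pass: sigmoid hidden layers followed by a softmax output."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = 1.0 / (1.0 + np.exp(-(a @ W + b)))
    logits = a @ weights[-1] + biases[-1]
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)
```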

[0109] Step A: Divide the data into a training set (containing 50,000 handwritten character pictures), a validation set (containing 10,000 handwritten character pictures), and a test set (containing 10,000 handwritten character pictures);

[0110] Step B: Resize all pictures to a uniform size of 24 × 24 pixels;
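
A possible implementation of Steps A and B is sketched below. The use of Pillow for resizing, the loading from file paths, and the assumption that the 70,000 samples arrive in a fixed order are illustrative choices, not part of the patent.

```python
import numpy as np
from PIL import Image

def load_and_resize(paths, size=(24, 24)):
    """Step B: load each picture, resize to 24x24 pixels, flatten to a 576-dim vector in [0, 1]."""
    return np.stack([
        np.asarray(Image.open(p).convert("L").resize(size), dtype=np.float32).ravel() / 255.0
        for p in paths
    ])

def split(images, labels):
    """Step A: 50,000 / 10,000 / 10,000 split into training, validation, and test sets."""
    train = (images[:50_000], labels[:50_000])
    valid = (images[50_000:60_000], labels[50_000:60_000])
    test  = (images[60_000:70_000], labels[60_000:70_000])
    return train, valid, test
```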

[0111] Second step A: Randomly initialize the weight matrix W^(1) between the input layer and the first hidden layer, the input variable bias b^(1), and the hidden variable bias c^(1);
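
One common way to perform this initialization is a small Gaussian draw for W^(1) and zero biases, as sketched below; the 0.01 scale and the 500 hidden units are assumptions, since the patent does not fix them at this point.

```python
import numpy as np

def init_rbm(n_visible, n_hidden, seed=0):
    """Randomly initialize W(1), b(1), c(1) for the input layer / first hidden layer pair."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))  # weight matrix W(1)
    b1 = np.zeros(n_visible)                                 # input variable bias b(1)
    c1 = np.zeros(n_hidden)                                  # hidden variable bias c(1)
    return W1, b1, c1

W1, b1, c1 = init_rbm(24 * 24, 500)  # 500 hidden units is an assumed width
```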

[0112] Second step B: Select 10 samples from the training set as input variables and use the contrastive divergence algorithm, with the parameters agreed in the present invention, to determine the weight matrix, input variable bias, and hidden variable bias between the input layer and the...
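
For reference, the sketch below shows one standard contrastive-divergence (CD-1) update on a mini-batch of 10 training samples. The learning rate and the sampling details are the usual CD defaults, not the specific parameters agreed in the present invention.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.1, rng=None):
    """One CD-1 step (v0 -> h0 -> v1 -> h1) updating the weight matrix and both biases."""
    rng = rng or np.random.default_rng(0)
    h0_prob = sigmoid(v0 @ W + c)
    h0 = (rng.random(h0_prob.shape) < h0_prob).astype(v0.dtype)   # sample hidden units
    v1_prob = sigmoid(h0 @ W.T + b)                               # reconstruct visible units
    h1_prob = sigmoid(v1_prob @ W + c)
    n = v0.shape[0]
    W = W + lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / n
    b = b + lr * (v0 - v1_prob).mean(axis=0)
    c = c + lr * (h0_prob - h1_prob).mean(axis=0)
    return W, b, c

# Usage with the 10-sample mini-batch from the training set (names are hypothetical):
# batch = train_images[:10]
# W1, b1, c1 = cd1_update(batch, W1, b1, c1)
```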



Abstract

The invention discloses a deep neural network training method based on multiple pre-training. In this method, the pre-training process of a deep neural network is completed by pre-training with a restricted Boltzmann machine and a denoising auto-encoder acting as models in turn. As a result, the outcome of pre-training, i.e. the initial values of the deep neural network parameters, carries the characteristics of the different models, so that a better initial model is provided for later training. Because the method only constrains the parameter values at the beginning of the different pre-training phases, it can be extended to other layer-by-layer pre-training models, such as the auto-encoder and the contractive auto-encoder.
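
A minimal sketch of this idea is given below, assuming sigmoid units, CD-1 for the RBM phase, a tied-weight denoising auto-encoder with masking noise, and illustrative epoch counts and learning rates. The point it illustrates is that the DAE phase of each layer starts from the parameter values produced by the RBM phase, so the resulting initial weights carry both characteristics; it is not the patent's exact procedure.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_epoch(X, W, b, c, lr=0.1, rng=None):
    """One full-batch CD-1 update of an RBM with parameters (W, b, c)."""
    rng = rng or np.random.default_rng(0)
    h0 = sigmoid(X @ W + c)
    v1 = sigmoid((rng.random(h0.shape) < h0).astype(X.dtype) @ W.T + b)
    h1 = sigmoid(v1 @ W + c)
    n = X.shape[0]
    return (W + lr * (X.T @ h0 - v1.T @ h1) / n,
            b + lr * (X - v1).mean(axis=0),
            c + lr * (h0 - h1).mean(axis=0))

def dae_epoch(X, W, b, c, lr=0.1, noise=0.3, rng=None):
    """One full-batch update of a tied-weight denoising auto-encoder with masking noise."""
    rng = rng or np.random.default_rng(0)
    Xn = X * (rng.random(X.shape) > noise)           # corrupt the input
    H = sigmoid(Xn @ W + c)                          # encode
    R = sigmoid(H @ W.T + b)                         # decode with tied weights
    gR = (R - X) * R * (1 - R)                       # squared-error output delta
    gH = (gR @ W) * H * (1 - H)                      # back-propagated hidden delta
    n = X.shape[0]
    return (W - lr * (Xn.T @ gH + gR.T @ H) / n,
            b - lr * gR.mean(axis=0),
            c - lr * gH.mean(axis=0))

def multiple_pretrain(X, layer_sizes, epochs=5, rng=None):
    """Layer-by-layer pre-training: an RBM phase, then a DAE phase that reuses its parameters."""
    rng = rng or np.random.default_rng(0)
    params, H = [], X
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        W = rng.normal(0.0, 0.01, size=(n_in, n_out))
        b, c = np.zeros(n_in), np.zeros(n_out)
        for _ in range(epochs):                      # phase 1: RBM pre-training
            W, b, c = cd1_epoch(H, W, b, c, rng=rng)
        for _ in range(epochs):                      # phase 2: DAE starts from the RBM result
            W, b, c = dae_epoch(H, W, b, c, rng=rng)
        params.append((W, b, c))
        H = sigmoid(H @ W + c)                       # hidden activations feed the next layer
    return params

# Example call (hidden widths are assumptions): multiple_pretrain(train_images, [576, 500, 500, 500])
```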

Description

Technical field

[0001] The invention belongs to the technical field of deep neural networks, and in particular relates to a deep neural network training method based on multiple pre-training.

Background technique

[0002] Deep neural networks are widely used in image recognition, speech recognition, natural language processing and other fields. The current training method for deep neural networks is to pre-train a single model to obtain initial values of the network weights with certain characteristics, and then use the back-propagation algorithm to train the network. Since different pre-training models have different characteristics, it is necessary in practice to try different pre-training models, comprehensively compare the performance of the initial networks obtained from each model, and finally determine the ideal pre-training model.

[0003] However, some problems require the initial values of the deep neural network's weights to be composite, that is, to have multiple characteristics rather than just a single one...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08
CPC: G06N3/08
Inventor: 胡振薛竹隐贺成龙宗士强葛唯益朱冰李霄徐琳俞露王羽姜晓夏
Owner: THE 28TH RES INST OF CHINA ELECTRONICS TECH GROUP CORP