
Dynamic neural network model training method based on ensemble learning and dynamic neural network model training device thereof

A dynamic neural network and model training technology, applied to biological neural network models, neural learning methods, neural architectures, etc. It addresses the problems that existing models cannot effectively describe the relationships among the system's inputs, cannot guarantee the classification performance of the neural network, and require a complicated and difficult design of the convolution and pooling functions; it achieves the effects of saving training time, reducing design difficulty, and improving accuracy.

Status: Inactive; Publication Date: 2017-12-15
SHANDONG NORMAL UNIV
Cites: 0; Cited by: 14

AI Technical Summary

Problems solved by technology

In this model, the choice of model depth and the design of the convolution and pooling functions are extremely complicated and difficult problems. In addition, current neural networks and their variants all model neurons as static neurons: a static neuron can only describe the relationship between its input and output; it cannot effectively describe the relationships among the system's inputs and therefore cannot guarantee the classification performance of the neural network.
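The contrast can be illustrated with a minimal sketch. The patent's own dynamic-neuron equations are not reproduced in this excerpt, so the leaky internal-state update used for `DynamicNeuron` below is a hypothetical stand-in, shown only to indicate how a neuron with internal state can respond to the dynamics among successive inputs while a static neuron cannot.

```python
import numpy as np

def static_neuron(x, w, b):
    # Static neuron: the output is a fixed function of the current input only.
    return np.tanh(np.asarray(w) @ np.asarray(x) + b)

class DynamicNeuron:
    """Hypothetical dynamic neuron: an internal state accumulates past inputs,
    so the output reflects input dynamics (illustrative stand-in, not the
    patent's formulation)."""
    def __init__(self, w, b, leak=0.5):
        self.w, self.b, self.leak = np.asarray(w), b, leak
        self.state = 0.0

    def step(self, x):
        # Mix the previous state with the newly weighted input, then squash.
        self.state = self.leak * self.state + self.w @ np.asarray(x) + self.b
        return np.tanh(self.state)
```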



Examples


Embodiment 1

[0056] Based on the dynamic neurons described above, this embodiment provides a training method for a dynamic neural network model based on ensemble learning, comprising the following steps:

[0057] Step 1: The original data are used as the input of the first-layer neurons of the i-th sub-model; after being processed by the dynamic neurons, the corresponding output values are the features of that layer, i = 1, 2, ..., k;

[0058] Optionally, the original data can be divided into a training set and a validation set; the training set is used to train the neural network model, and the validation set is used for subsequent evaluation of model performance.

[0059] Step 2: Increase the number of neuron layers, using the output features of the previous layer as the input of the next layer of neurons to obtain the features of the corresponding layer; repeat this step until the number of layers reaches a preset value r (see the code sketch below);

[0060] The same layer in the same neural network h...
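A minimal sketch of Steps 1 and 2 for one sub-model follows, assuming the original data form a matrix of samples by attributes. The dynamic-neuron transform itself is the patent's and is not reproduced in this excerpt, so `dynamic_layer`, the layer width, and the depth `r` below are hypothetical placeholders used only to show how each layer's output features become the next layer's input until the preset depth is reached.

```python
import numpy as np

def dynamic_layer(features, n_out, rng):
    # Placeholder for one layer of dynamic neurons: maps the previous layer's
    # features to this layer's features (random weights stand in for the
    # patent's dynamic-neuron processing).
    w = 0.1 * rng.standard_normal((features.shape[1], n_out))
    return np.tanh(features @ w)

def build_submodel_features(x, r=3, width=64, seed=0):
    """Step 1: the original data feed the first-layer neurons of the i-th
    sub-model. Step 2: each layer's output features feed the next layer,
    repeated until the number of layers reaches the preset value r."""
    rng = np.random.default_rng(seed)
    features = np.asarray(x, dtype=float)   # original data = first-layer input
    for _ in range(r):
        features = dynamic_layer(features, width, rng)
    return features                          # features of the r-th layer
```

If the optional split of paragraph [0058] is used, only the training portion of the data would be passed through `build_submodel_features` during training, with the validation portion held back for the later performance evaluation.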

Embodiment 2

[0076] Based on the method of Embodiment 1, the present invention also provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor; the computer program is used to train the ensemble dynamic neural network model, and the processor performs the following steps when executing the program:

[0077] Step 1: For each sub-model in the ensemble dynamic neural network model, the original data are used as the input of the first-layer neurons, and the output features of that layer are obtained through processing by the dynamic neurons;

[0078] Step 2: Increase the number of neuron layers, using the output features of the previous layer as the input of the next layer of neurons to obtain the features of the corresponding layer; repeat this step until the number of layers reaches a preset value;

[0079] Step 3: Establish a fully connected layer between the output feature of the last layer...
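A sketch of Step 3 follows: a fully connected layer is placed between the last layer's output features and the classes, and its weights are computed. The patent's exact weight-calculation rule is not visible in this excerpt, so a regularized least-squares fit to one-hot class targets is used here purely as an illustrative stand-in.

```python
import numpy as np

def fit_fully_connected(features, labels, n_classes, reg=1e-3):
    """Compute fully connected weights W mapping last-layer features to class
    scores (illustrative ridge / least-squares fit, not the patent's rule)."""
    targets = np.eye(n_classes)[labels]          # one-hot class matrix
    h = np.asarray(features, dtype=float)
    # W = (H^T H + reg * I)^(-1) H^T T
    return np.linalg.solve(h.T @ h + reg * np.eye(h.shape[1]), h.T @ targets)

def classify(features, w):
    # Predicted class = index of the largest fully connected output.
    return (np.asarray(features) @ w).argmax(axis=1)
```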

Embodiment 3

[0091] A computer-readable storage medium on which a computer program is stored, the computer program being used to train the ensemble dynamic neural network model; when the program is executed by a processor, the following steps are performed:

[0092] Step 1: For each sub-model in the ensemble dynamic neural network model, the original data are used as the input of the first-layer neurons, and the output features of that layer are obtained through processing by the dynamic neurons;

[0093] Step 2: Increase the number of neuron layers, using the output features of the previous layer as the input of the next layer of neurons to obtain the features of the corresponding layer; repeat this step until the number of layers reaches a preset value;

[0094] Step 3: Establish a fully connected layer between the output feature of the last layer and the category to which it ...
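Because each sub-model is a relatively shallow network trained independently, the k sub-models can be built concurrently, which is where the claimed saving in training time comes from. The sketch below assumes a hypothetical hook `train_one(seed)` that performs Steps 1 to 3 for one sub-model and returns it together with a validation score; weighting the sub-models by that score is an illustrative choice, since the patent's weighting rule for the sub-models is not shown in this excerpt.

```python
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def train_ensemble_in_parallel(train_one, k=5):
    """Build k shallow sub-models concurrently, then derive per-sub-model
    weights for the ensemble. train_one(seed) -> (submodel, validation_score)."""
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(train_one, range(k)))   # one task per sub-model
    submodels = [m for m, _ in results]
    scores = np.array([s for _, s in results], dtype=float)
    weights = scores / scores.sum()     # illustrative: weight by validation score
    return submodels, weights
```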



Abstract

The invention discloses a dynamic neural network model training method based on ensemble learning and a dynamic neural network model training device thereof. The method comprises the following steps: for each sub-model in an ensemble dynamic neural network model, the original data serve as the input of the first layer of neurons, and the output features of that layer are obtained through processing by dynamic neurons; the number of neuron layers is increased, the output features of the previous layer serve as the input of the next layer of neurons to obtain the features of the corresponding layer, and this step is repeated until the number of layers reaches a preset value; a fully connected layer is established between the output features of the last layer and the class they belong to, and the fully connected weights between the output features and the classes are calculated; a fully connected layer is likewise established between each sub-model and the class, and the weight of each sub-model in the ensemble dynamic neural network model is determined. A deep neural network is thus converted into multiple relatively shallow neural networks that are processed in parallel, so that training time is saved and training efficiency is improved.
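A sketch of the final ensemble decision described above: each sub-model scores the classes through its fully connected layer, and the scores are combined with the per-sub-model weights. The weighted-sum combination and the score shapes are assumptions made for illustration; the patent's precise combination rule is not reproduced in this excerpt.

```python
import numpy as np

def ensemble_predict(submodel_scores, submodel_weights):
    """submodel_scores: list of k arrays of shape (n_samples, n_classes);
    submodel_weights: length-k weights of the sub-models in the ensemble."""
    combined = sum(w * s for w, s in zip(submodel_weights, submodel_scores))
    return combined.argmax(axis=1)       # final class = highest weighted score
```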

Description

Technical Field

[0001] The invention relates to the fields of artificial intelligence and big data, and in particular to a model training method and device for object classification.

Background Technique

[0002] Artificial intelligence has become a research hotspot in today's society, and new research results regularly capture public attention. Among its directions, the correct classification and identification of objects has become an important research focus. Object recognition has made considerable progress, largely thanks to artificial neural networks and their many variants, such as convolutional neural networks and recurrent neural networks. A neural network trains its internal structure on large amounts of data to achieve strong expressive power; however, the growth in data volume further increases the difficulty of model training. In order to sol...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/04
Inventors: 王强, 张化祥, 孟庆田, 马学强, 任玉伟
Owner: SHANDONG NORMAL UNIV