A neural network migration method based on shallow learning

A neural network and neural network model technology, applied to neural learning methods, biological neural network models, and neural architectures. It addresses problems such as an unclear number of layers to transfer, cumbersome migration procedures, and increased method complexity, and achieves the effects of lower demand for memory and graphics-card resources and simple, efficient transfer learning.

Active Publication Date: 2019-04-02
UNIV OF ELECTRONICS SCI & TECH OF CHINA
6 Cites, 18 Cited by

AI Technical Summary

Problems solved by technology

[0010] The problems with existing methods are: 1) When using a pre-trained source-task deep neural network, the gap between the source task and the current task must be small, otherwise transfer learning performs poorly; this increases the complexity of the method and makes its effect hard to guarantee.
2) Pre-training the source-task deep neural network itself requires a large amount of labeled data and massive graphics-card resources, yet the purpose of transfer learning is precisely to reduce the demand for such data and resources when training the target-task deep network. Since the source-task network must still be pre-trained with those same resources, this approach does not fundamentally reduce the resource demand of completing the target task with deep learning.
3) When migrating a pre-trained source-task deep neural network end to end into another deep network, it is unclear how many layers should be transferred; a good transfer configuration is usually found by trial and error based on the programmer's experience. There is no principled way to know how many of the first layers of the deep network should be transferred to optimize the training of the final target-task network, nor to what extent each transferred layer affects the target task's predictions.



Examples


Embodiment 1

[0056] A neural network migration method based on shallow learning, provided by a preferred embodiment of the present invention, is applied to image recognition tasks. The steps of the method are as follows:

[0057] Step 1. Preprocess the target task data set: divide the image-recognition-related tasks to form a task dictionary, label the classified target tasks, and store the labeled data as the training data x0 of the shallow neural network. Objects of the same type share basically the same attributes and characteristics; for example, an animal has a head plus limbs, a vehicle has wheels and a carrier, and so on. This step is specifically:

[0058] 1) The images in the open-source ImageNet data set are coarsely divided, by the objects to be recognized, into eight image recognition tasks: animals, plants, buildings, roads, landscapes, objects, vehicles, and text; these categories are recorded as 1 to 8. The images are sorted into eight folders according to these categories...
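The "task dictionary" mapping the eight coarse categories to the labels 1 to 8 can be sketched as follows. This is a minimal illustration; the names `task_dict`, `label_for`, and `folder_for` are hypothetical, not from the patent.

```python
# Hypothetical sketch of the Step 1 task dictionary: eight coarse ImageNet
# categories mapped to the integer labels 1..8 used to sort images into folders.

CATEGORIES = ["animals", "plants", "buildings", "roads",
              "landscapes", "objects", "vehicles", "text"]

# Task dictionary: category name -> label 1..8
task_dict = {name: i + 1 for i, name in enumerate(CATEGORIES)}

def label_for(category: str) -> int:
    """Return the integer label recorded for a coarse category."""
    return task_dict[category]

def folder_for(category: str) -> str:
    """Folder name the labeled images are stored under (hypothetical layout)."""
    return f"{label_for(category)}_{category}"
```

For instance, `folder_for("animals")` would place animal images under a folder prefixed with label 1.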



Abstract

The invention discloses a neural network migration method based on shallow learning. The method comprises the steps of: (1) classifying and dividing a target task data set, labeling it, and storing the labeled data as the training data x0 of a shallow neural network; (2) inputting x0 into the shallow neural network, training it layer by layer to obtain a pre-trained shallow neural network model, and outputting the data x2 produced by passing x0 through the pre-trained model; (3) taking the output data x2 of the pre-trained shallow model as the input of the target task's deep neural network model, training the whole deep network with the labeled target-task data, and fine-tuning all network parameters to complete the neural network migration. Because the method uses a layer-by-layer-trained shallow neural network as the base model of task migration, the migration task is simple, efficient, and highly extensible, and the method avoids the problem that the transfer effect of a traditional end-to-end deep neural network fluctuates unpredictably and can even be negative.
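The three-step pipeline in the abstract can be sketched in NumPy as below. This is a toy sketch under stated assumptions: the shallow weights W1 and W2 simply stand in for the result of layer-by-layer pre-training (the actual training is omitted), and the sizes and the single-layer deep "head" are illustrative, not the patent's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

# Step 1 output: labeled training data x0 (16 samples, 32 features here).
x0 = rng.normal(size=(16, 32))

# Step 2: pass x0 through a "pre-trained" shallow network to get x2.
# W1, W2 stand in for weights obtained by layer-by-layer training.
W1 = rng.normal(scale=0.1, size=(32, 24))   # shallow layer 1
W2 = rng.normal(scale=0.1, size=(24, 24))   # shallow layer 2
x2 = relu(relu(x0 @ W1) @ W2)               # shallow model output x2

# Step 3: x2 becomes the input of the target-task deep network, which would
# then be trained end to end and fine-tuned on the labeled target data.
W3 = rng.normal(scale=0.1, size=(24, 8))    # hypothetical head for 8 classes
logits = x2 @ W3
```

The point of the arrangement is that the deep network never sees raw x0; it consumes the fixed shallow representation x2, so only the deep part needs full training on the target task.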

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to a neural network transfer method based on shallow learning.

Background Technique

[0002] Shallow learning: machine learning using low-depth neural networks. It is a representation-learning method that builds a learning structure with only a few hidden layers to learn attribute categories or features and discover distributed feature representations of the data. Shallow supervised neural networks are easier to interpret, analyze, and optimize than deep networks, but have less representational power.

[0003] Transfer learning: transferring the parameters of an already learned and trained model to a new model to help train the new model. Since most data and tasks are related, through transfer learning we can share the learned model parameters (also understood as the knowledge learned b...
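The layer-by-layer training of a shallow network mentioned in the background can be sketched as greedy layer-wise pre-training. The sketch below uses a tied-weight linear autoencoder per layer as a stand-in objective; the function name, sizes, and hyperparameters are assumptions for illustration, not the patent's specification.

```python
import numpy as np

def pretrain_layer(x, hidden, steps=300, lr=0.005, seed=0):
    """Greedily pre-train one layer as a tied-weight linear autoencoder:
    minimize ||x @ W @ W.T - x||^2 by gradient descent. A toy stand-in for
    one step of layer-by-layer shallow training."""
    rng = np.random.default_rng(seed)
    n = len(x)
    W = rng.normal(scale=0.1, size=(x.shape[1], hidden))
    for _ in range(steps):
        err = x @ W @ W.T - x                         # reconstruction error
        grad = (x.T @ err @ W + err.T @ x @ W) / n    # gradient of 0.5*||err||^2
        W -= lr * grad
    return W, x @ W                                   # weights, hidden codes

# Stack two greedily trained layers; h2 plays the role of x2 in the patent.
rng = np.random.default_rng(1)
x0 = rng.normal(size=(20, 8))               # labeled training data x0
W1, h1 = pretrain_layer(x0, hidden=6)       # layer 1 trained on x0
W2, h2 = pretrain_layer(h1, hidden=4)       # layer 2 trained on layer 1's output
```

Each layer is trained only on the output of the layer below it, which is what makes the shallow model cheap to train compared with end-to-end deep training.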

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06N3/04, G06N3/08
CPC: G06N3/084, G06N3/045
Inventor 牛新征刘鹏飞徐畅李柯江朱家辉陈加伟朱戈潘袁湘
Owner UNIV OF ELECTRONICS SCI & TECH OF CHINA