
Deep neural network multi-task hyper-parameter optimization method and device

A deep neural network optimization technology, applied in the field of multi-task hyperparameter optimization of deep neural networks, which solves the problem of the large amount of calculation required for hyperparameter optimization and achieves the effects of speeding up learning and avoiding excessive calculation.

Inactive Publication Date: 2019-11-12
SHENZHEN UNIV

AI Technical Summary

Problems solved by technology

[0006] In order to overcome the deficiencies of the prior art, the first object of the present invention is to provide a deep neural network multi-task hyperparameter optimization method, which can solve the problem in the prior art that hyperparameter optimization requires a large amount of calculation.
[0007] The second object of the present invention is to provide an electronic device, which can solve the same problem of the large amount of calculation required for hyperparameter optimization in the prior art.
[0008] The third object of the present invention is to provide a computer-readable storage medium, which can likewise solve the problem of the large amount of calculation required for hyperparameter optimization in the prior art.



Examples


Embodiment 1

[0044] Aiming at the problem that the existing Bayesian optimization algorithm can only optimize a single task and cannot learn information shared between related tasks, the present invention proposes a multi-task learning (Multi-Task Learning, MTL) network model applied in the Bayesian optimization algorithm, so that multiple tasks are learned at the same time. Compared with the single-task learning method, each task can draw on relevant information from other related tasks to promote its own learning and learn more feature information. In addition, the present invention uses a Radial Basis Function (RBF) neural network instead of the traditional Gaussian model for model training, which reduces the amount of calculation and speeds up learning.
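As a rough illustration of this substitution, the following minimal sketch (not the patent's implementation; the center-selection rule and the `n_centers` and `gamma` parameters are illustrative assumptions) fits an RBF network surrogate by least squares, avoiding the cubic-cost kernel solve that a Gaussian-process surrogate requires:

```python
# Minimal RBF-network surrogate sketch, assuming exp(-gamma * ||x - c||^2)
# features and a linear output layer; a stand-in for the GP surrogate.
import numpy as np

class RBFSurrogate:
    def __init__(self, n_centers=10, gamma=1.0):
        self.n_centers = n_centers
        self.gamma = gamma
        self.centers = None
        self.weights = None

    def _features(self, X):
        # Squared distances between every input and every center.
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-self.gamma * d2)

    def fit(self, X, y):
        # Pick centers from the training data, then solve the linear
        # output layer by least squares (cheaper than a GP's O(n^3) solve).
        idx = np.random.choice(len(X), min(self.n_centers, len(X)), replace=False)
        self.centers = X[idx]
        Phi = self._features(X)
        self.weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)

    def predict(self, X):
        return self._features(X) @ self.weights
```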

[0045] That is to say, the present invention connects the outputs of the radial basis neural networks corresponding to the multiple tasks through a fully connected layer, so that the information of the multiple tasks can be shared, and th...
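A hedged sketch of this shared structure, assuming a PyTorch-style module with one RBF feature layer per task and a single fully connected layer that couples the concatenated task features (the layer sizes and training framework are illustrative, not the patent's exact design):

```python
# One RBF feature layer per task; a shared fully connected layer mixes
# the concatenated task features and emits one prediction per task.
import torch
import torch.nn as nn

class RBFLayer(nn.Module):
    def __init__(self, in_dim, n_centers, gamma=1.0):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(n_centers, in_dim))
        self.gamma = gamma

    def forward(self, x):
        d2 = torch.cdist(x, self.centers) ** 2   # (batch, n_centers)
        return torch.exp(-self.gamma * d2)

class MultiTaskRBFNet(nn.Module):
    def __init__(self, in_dim, n_tasks, n_centers=16):
        super().__init__()
        self.task_rbfs = nn.ModuleList(
            RBFLayer(in_dim, n_centers) for _ in range(n_tasks))
        # Shared fully connected layer couples all tasks' features.
        self.shared_fc = nn.Linear(n_tasks * n_centers, n_tasks)

    def forward(self, x):
        feats = torch.cat([rbf(x) for rbf in self.task_rbfs], dim=-1)
        return self.shared_fc(feats)
```

Because the fully connected layer mixes features from all tasks, a gradient update driven by one task's output also adjusts weights used by the other tasks, which is the information-sharing effect described above.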

Embodiment 2

[0125] The present invention also provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and runnable on the processor. When the processor executes the program, the steps of the deep neural network multi-task hyperparameter optimization method described herein are implemented.

Embodiment 3

[0127] The present invention also provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the deep neural network multi-task hyperparameter optimization method described herein are implemented.



Abstract

The invention discloses a deep neural network multi-task hyper-parameter optimization method. The method comprises: firstly, performing model training on the data training set of each task to obtain a multi-task learning network model; secondly, predicting all points in an unknown region and screening candidate points from the prediction result; and finally, evaluating the screened candidate points, adding the candidate points and their target function values to the data training set, and carrying out model establishment, prediction, screening, and evaluation again; and so on, until the maximum number of iterations is reached, at which point the candidate point corresponding to the maximum target function value is selected from the data training set, that is, the hyper-parameter combination of each task in the multi-task learning network model. In the method, the Gaussian model is replaced by a radial basis function neural network model, which is combined with multi-task learning and applied in the Bayesian optimization algorithm to realize hyper-parameter optimization, so that the calculation amount of hyper-parameter optimization is greatly reduced. The invention further discloses an electronic device and a storage medium.
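The iterative loop in the abstract (train, predict, screen, evaluate, extend the training set, repeat, then take the best point) can be outlined as below. Here `objective`, `candidate_pool`, and the greedy screening by predicted value are placeholder assumptions; the abstract does not fix the exact screening criterion:

```python
# Sketch of the train / predict / screen / evaluate loop from the abstract.
import numpy as np

def optimize(objective, candidate_pool, surrogate, X, y, max_iter=30):
    X, y = list(X), list(y)
    for _ in range(max_iter):
        surrogate.fit(np.array(X), np.array(y))       # model training
        preds = surrogate.predict(candidate_pool)     # predict unknown region
        best = candidate_pool[int(np.argmax(preds))]  # screen a candidate
        X.append(best)                                # evaluate it and add it
        y.append(objective(best))                     # to the training set
    i = int(np.argmax(y))                             # point with the maximum
    return X[i], y[i]                                 # target function value
```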

Description

Technical Field

[0001] The invention relates to hyperparameter optimization, and in particular to a deep neural network multi-task hyperparameter optimization method, an electronic device, and a storage medium.

Background Technique

[0002] In most cases, hyperparameters are tuned manually. For example, experienced professionals can set good values for hyperparameters based on years of experience, or repeatedly train the model and then adjust the hyperparameter values based on experience. Beginners, however, can only spend a great deal of time tuning hyperparameters. Therefore, the automatic adjustment of hyperparameters has attracted the attention of many researchers. The earliest automatic parameter adjustment methods are the grid search and random search methods.

[0003] Grid search is a brute-force search method. First, the range of each hyperparameter must be determined, and then multiple candidate solutions are combined thr...
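The brute-force enumeration described in [0003] can be sketched as follows; `train_and_score` is a hypothetical stand-in for training a model and returning a validation score:

```python
# Minimal grid search: try every combination of candidate values and
# keep the best-scoring one. `train_and_score` is a placeholder.
from itertools import product

def grid_search(train_and_score, grid):
    best_params, best_score = None, float("-inf")
    for combo in product(*grid.values()):   # exhaust all combinations
        params = dict(zip(grid.keys(), combo))
        score = train_and_score(**params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Example: a 3 x 3 grid means 9 exhaustive training runs.
# grid = {"learning_rate": [1e-3, 1e-2, 1e-1], "batch_size": [16, 32, 64]}
```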


Application Information

IPC(8): G06N3/08, G06N3/04
CPC: G06N3/084, G06N3/045
Inventors: 骆剑平, 陈娇
Owner: SHENZHEN UNIV