Model training method and device, server and storage medium

A model training and server technology, applied in the field of model training, which addresses problems such as the expense and difficulty of obtaining large amounts of labeled data and the poor generalization performance of algorithm models trained without it. The effect achieved is to alleviate model training's dependence on real sampled data and to accelerate the practical application of the technology.

Pending Publication Date: 2020-07-10
深圳市凌雀智能科技有限公司

AI Technical Summary

Problems solved by technology

If there is not enough data, the generalization performance of the algorithm model may be poor
However, it is already difficult and expensive to obtain large amounts of data, and it is even more difficult to require this data to be labeled

Method used



Examples


Embodiment 1

[0042] Figure 1 is a flow chart of the model training method provided by Embodiment 1 of the present invention. The method specifically includes the following steps:

[0043] Step 110: input the sample set IPk into the training model set APk for training to obtain the k-th training result, where the k-th training result includes the training model set APk+1 and the k-th sample label, and the k-th sample label is a training-error label;

[0044] In this embodiment, when k=1, the sample set IPk is the initial sample set and the training model set APk is the initial training model set. In some embodiments, the initial training models and initial samples can also be generated in a completely random manner, or generated heuristically using prior knowledge; a single generation produces an arbitrary positive integer number of initial training models not exceeding NAP and an arbitrary positive integer number of initial samples not exceeding NAP.
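As a runnable illustration of the paragraph above, the sketch below generates bounded random populations of initial models and initial samples. The flat-vector encodings, the `dim` parameter, and the function name `random_init` are assumptions made only for illustration; the patent leaves these representations abstract.

```python
import random

def random_init(n_ap, dim, seed=0):
    """Generate at most n_ap random initial models and initial samples in one shot.

    Models are encoded as Gaussian parameter vectors and samples as uniform
    feature vectors; both encodings are assumptions, not from the patent.
    """
    rng = random.Random(seed)
    n_models = rng.randint(1, n_ap)   # any positive integer up to N_AP
    n_samples = rng.randint(1, n_ap)  # likewise bounded
    models = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_models)]
    samples = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_samples)]
    return models, samples
```

A heuristic variant would replace the random draws with templates derived from prior knowledge, as the embodiment also allows.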

[0045] In this embod...

Embodiment 2

[0057] Figure 2 is a flow chart of the model training method provided by Embodiment 2 of the present invention. The method specifically includes the following steps:

[0058] Step 210: input the sample set IPk into the training model set APk for training to obtain the k-th training result, where the k-th training result includes the training model set APk+1 and the k-th sample label, and the k-th sample label is a training-error label;

[0059] In this embodiment, when k=1, the sample set IPk is the a priori known initial sample set, and the training model set APk is the a priori known initial training model set. In some embodiments, the initial training models and initial samples may also be generated in a completely random manner.

[0060] In this embodiment, sample labeling means labeling each sample with the size of its error during the training process. In this embodiment, the calibrated error is the ratio of misclassified samples of the training model set during t...
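The misclassification-ratio error described above can be sketched as follows; representing each model as a callable that maps an input to a predicted label is an assumption made only for the example.

```python
def error_ratio(models, samples):
    """Calibrated error: the fraction of (model, sample) pairs that are misclassified.

    `models` is a set of callables x -> predicted label (an assumed encoding);
    `samples` is a list of (x, true_label) pairs.
    """
    wrong = sum(1 for model in models
                for x, y in samples
                if model(x) != y)
    return wrong / (len(models) * len(samples))
```

For instance, a constant classifier that always predicts 1, evaluated on one correctly and one incorrectly labeled sample, yields an error ratio of 0.5.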

Embodiment 3

[0076] The model training device provided by this embodiment of the present invention can execute the model training method provided by any embodiment of the present invention. As shown in Figure 3, the model training device 3 includes a training module 31, a target sample set confirmation module 32, and a sample set generation module 33.

[0077] The training module 31 is used to input the sample set IPk into the training model set APk for training to obtain the k-th training result, where the k-th training result includes the training model set APk+1 and the k-th sample label, and the k-th sample label is a training-error label;

[0078] In this embodiment, when k=1, the sample set IPk is the a priori known initial sample set, and the training model set APk is the a priori known initial training model set.

[0079] The target sample set confirmation module 32 is used to determine the target sample set IPn in the sample set IPk according to the k-th sample label;
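A minimal sketch of module 32's selection rule, under the assumption that the samples whose training-error label is nonzero form the target set; the function name and the binary-label convention are assumptions, not from the patent.

```python
def confirm_target_set(samples, labels):
    """Target sample set IP_n: the samples whose k-th error label is set.

    `labels` is assumed to be one error label per sample (nonzero = misclassified).
    """
    return [s for s, label in zip(samples, labels) if label]
```

A graded variant could instead rank samples by error magnitude and keep the top fraction.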

[0080] In this embodiment, specifically, the t...



Abstract

The invention discloses a model training method and device, a server, and a storage medium. The model training method comprises the steps of: inputting a sample set IPk into a training model set APk to be trained, obtaining a k-th training result, where the k-th training result comprises a training model set APk+1 and a k-th sample annotation, and the k-th sample annotation is a training-error annotation; determining a target sample set IPn in the sample set IPk according to the k-th sample annotation; generating a sample set IPk+1 according to the target sample set IPn; and inputting the sample set IPk+1 into the training model set APk+1 for training to obtain a (k+1)-th training result, where the (k+1)-th training result comprises a training model set APk+2, k is equal to n, and k and n are both positive integers greater than or equal to 1. According to the method, the problem of model training's high dependence on samples is solved, the dependence of model training on real sampled data is relieved, and the effect of accelerating the application of artificial intelligence technology is achieved.
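Read as pseudocode, the abstract describes a single iterative loop: train, label errors, select targets, regenerate samples, repeat. The toy sketch below instantiates it on a 1-D threshold classifier; every concrete choice (the classifier, the perturbation that generates IPk+1, the fallback when nothing is misclassified) is an assumption made only so the loop runs end to end.

```python
import random

def train(model, samples):
    # "Train" a 1-D threshold classifier: move the threshold toward the sample mean.
    xs = [x for x, _ in samples]
    return 0.5 * model + 0.5 * (sum(xs) / len(xs))

def label_errors(model, samples):
    # k-th sample label: 1 if the sample is misclassified (training error), else 0.
    return [int((x >= model) != bool(y)) for x, y in samples]

def select_targets(samples, labels):
    # Target sample set IP_n: the samples the current model still gets wrong.
    picked = [s for s, label in zip(samples, labels) if label]
    return picked or samples  # assumed fallback when nothing is misclassified

def generate_samples(targets, rng):
    # Sample set IP_{k+1}: perturbed copies of the target samples (assumed scheme).
    return [(x + rng.uniform(-0.1, 0.1), y) for x, y in targets]

def training_loop(samples, model, rounds, seed=0):
    rng = random.Random(seed)
    for _ in range(rounds):
        model = train(model, samples)              # AP_{k+1}
        labels = label_errors(model, samples)      # k-th sample label
        targets = select_targets(samples, labels)  # IP_n
        samples = generate_samples(targets, rng)   # IP_{k+1}
    return model, samples
```

In the claimed method the single model and the perturbation step would be replaced by a model set and a sample-generation procedure; the loop structure is what the abstract specifies.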

Description

Technical field

[0001] Embodiments of the present invention relate to model training technology, and in particular to a model training method, device, server, and storage medium.

Background technique

[0002] In recent years, artificial intelligence has made breakthrough progress and has been widely applied in real life, providing all walks of life with efficient solutions that are comparable to or even surpass human capabilities. At the heart of the AI approach is model training. Training current mainstream artificial intelligence models, such as Generative Adversarial Networks (GANs), usually requires a large amount of labeled data as input to the training process. If there is not enough data, the generalization performance of the algorithmic model may be poor. However, it is already difficult and expensive to obtain large amounts of data, and it is even more difficult to require this data to be labeled. Therefore, there is an urgent need for a method that can pe...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08
CPC: G06N3/08
Inventor: 杨鹏, 唐珂, 姚新
Owner: 深圳市凌雀智能科技有限公司