
A deep learning model tuning management method, device, equipment and medium

A deep learning model tuning and management technique, applied in data processing applications, character and pattern recognition, and instruments. It addresses problems such as low prediction accuracy for situations the model has not yet learned, the time and computing power consumed by retraining, and retraining triggers being skewed by prediction-set size, thereby achieving automatic retraining and realizing a model management function.

Active Publication Date: 2022-06-07
INSPUR SUZHOU INTELLIGENT TECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] For managing a retrained model, the retraining period and the trigger condition are critical. Too many retrainings indicate that the model struggles to meet the manager's accuracy requirements, and each retraining consumes time and computing power. Too few retrainings leave the model un-updated for long periods, so prediction accuracy is low for new situations the model has not learned. Moreover, a prediction set that is too large or too small at each prediction round distorts when retraining is triggered.





Embodiment Construction

[0018] In order to make the objectives, technical solutions and advantages of the present invention more clearly understood, the embodiments of the present invention will be further described in detail below with reference to the specific embodiments and the accompanying drawings.

[0019] It should be noted that the expressions "first" and "second" in the embodiments of the present invention are used only to distinguish two entities or parameters that share the same name but are not identical. "First" and "second" are merely for convenience of expression and should not be construed as limiting the embodiments of the present invention; subsequent embodiments will not repeat this note.

[0020] Based on the above purpose, a first aspect of the embodiments of the present invention proposes an embodiment of a deep learning model tuning management method. Figure 1 shows a schematic diagram of an embodiment of a...



Abstract

The invention discloses a method for tuning and managing a deep learning model, comprising: in response to the number of pictures in a data set to be predicted exceeding a set threshold, dividing the data set evenly into several sub-data sets and predicting each sub-data set with the deep learning model to obtain first prediction data; obtaining judgment data for each sub-data set through manual prediction and judgment; applying a Kalman filter to the first prediction data and the judgment data of each sub-data set to obtain second prediction data for each sub-data set; computing the error rate of the deep learning model from the first and second prediction data; and triggering retraining of the deep learning model in response to the error rate exceeding an error-rate threshold. The invention also discloses a corresponding device, equipment, and medium. The principle is easy to understand, simple to operate and implement, preserves accuracy, and enables automatic model retraining and tuning.
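The pipeline described in the abstract can be sketched in a few functions. This is a minimal illustration, not the patent's implementation: all function names, the variance parameters `r_model` / `r_manual`, and the thresholds are assumptions, and the "Kalman filter" step is shown as a single static-gain Kalman update fusing model scores with manual judgments.

```python
import numpy as np

def split_dataset(images, max_size):
    """Step 1: divide the data set evenly into sub-data sets once it
    exceeds the size threshold (ceiling division picks the part count)."""
    n_parts = -(-len(images) // max_size)
    return np.array_split(np.asarray(images), n_parts)

def kalman_fuse(first_pred, judged, r_model=0.05, r_manual=0.01):
    """Step 3: fuse the model's scores (first prediction data) with the
    manual judgments via a single Kalman update, yielding the second
    prediction data. The measurement variances are illustrative."""
    k = r_model / (r_model + r_manual)        # Kalman gain (trust in judgments)
    return first_pred + k * (judged - first_pred)

def error_rate(first_pred, second_pred, tol=0.5):
    """Step 4: fraction of samples whose first and second predictions
    disagree by more than tol."""
    return float(np.mean(np.abs(first_pred - second_pred) > tol))

# Step 5: trigger retraining when the error rate exceeds a threshold.
first = np.array([0.9, 0.2, 0.8, 0.1])   # model scores per picture
judged = np.array([0.0, 0.0, 1.0, 1.0])  # manual judgments
second = kalman_fuse(first, judged)
needs_retraining = error_rate(first, second) > 0.25
```

Because the filtered estimate is pulled strongly toward the manual judgments, a large gap between first and second prediction data serves as a proxy for model error, which is what drives the retraining trigger.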

Description

Technical Field

[0001] The present invention relates to the field of image processing, and more particularly to a deep learning model tuning management method, apparatus, device and medium.

Background Technique

[0002] At present, retraining management for a prediction model falls into two types. The first compares the model's predictions against manual annotations and accumulates the number of pictures that humans identify as mispredicted. When the error count reaches a threshold, all positive and negative samples accumulated over the previous predictions are used as the training set to retrain the model. For example, with a threshold of 200 mispredicted pictures, if 100 of the first 1,000 pictures and 150 of the second 2,000 pictures are predicted incorrectly, then after the second prediction round all 3,000 previous pictures will be use...
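The threshold-based scheme from the background section can be sketched as a small accumulator. The class and method names are illustrative assumptions, not from the patent; the numbers in the usage follow the example in the text.

```python
class ErrorCountTrigger:
    """Accumulate human-identified misprediction counts across prediction
    rounds; signal retraining once the count reaches a threshold
    (hypothetical sketch of the prior-art scheme)."""

    def __init__(self, error_threshold=200):
        self.error_threshold = error_threshold
        self.accumulated_errors = 0
        self.accumulated_samples = 0  # pictures kept as the retraining set

    def record_round(self, n_pictures, n_errors):
        """Record one prediction round; return (triggered, training_set_size).
        On trigger, all accumulated pictures become the training set and
        the counters reset."""
        self.accumulated_samples += n_pictures
        self.accumulated_errors += n_errors
        if self.accumulated_errors >= self.error_threshold:
            training_set_size = self.accumulated_samples
            self.accumulated_errors = 0
            self.accumulated_samples = 0
            return True, training_set_size
        return False, 0

trigger = ErrorCountTrigger(error_threshold=200)
round1 = trigger.record_round(1000, 100)  # 100 errors: below threshold
round2 = trigger.record_round(2000, 150)  # 250 total: retrain on 3000 pictures
```

This illustrates the drawback the patent responds to: the trigger depends only on a raw error count, so the size of each prediction set directly skews how often retraining fires.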

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/62; G06Q10/04; G06V10/774
CPC: G06Q10/04; G06F18/214
Inventor: 张书博
Owner: INSPUR SUZHOU INTELLIGENT TECH CO LTD