
Multi-class model training method based on gradient balance, medium and equipment

A multi-category model training technology applied to computational models, biological neural network models, and character and pattern recognition. It addresses the problems that an excess of simple samples makes it difficult for the model to learn difficult samples, while over-emphasis on difficult samples reduces the model's ability to learn simple samples; the effects are improved generalization ability and accuracy and shortened training time.

Pending Publication Date: 2021-04-09
成都艾特能电气科技有限责任公司

AI Technical Summary

Problems solved by technology

For some tasks, an excess of simple samples masks the influence of the small number of difficult samples, making it hard for the model to learn from difficult samples; conversely, paying too much attention to difficult samples weakens the model's ability to learn from simple samples.
Therefore, there is currently no training strategy that can learn from simple and difficult samples simultaneously, which shows that current mainstream training methods still need further improvement.



Examples


Embodiment 1

[0037] Embodiments of the present invention provide a multi-category model training method based on gradient balance, which can be widely applied on computer equipment such as personal computers, cloud servers, computing clusters, or other electronic devices capable of executing model training.

[0038] As shown in Figure 1, the training method comprises the following steps:

[0039] S1: Divide the training sample data into several batches and input them into the selected neural network model for training to obtain the loss function;

[0040] In this embodiment, the OpenImage dataset, which covers many categories and a large amount of data, is adopted as the training samples, and the neural network model is a machine learning model trained by the chain rule and stochastic gradient descent, such as the VGG, Inception, or ResNet convolutional neural network models.
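The following is a minimal, illustrative sketch of step S1, assuming a PyTorch-style setup. The torchvision ResNet-18, the batch size, the per-sample cross-entropy loss, and the helper name iterate_losses are assumptions for illustration and are not fixed by this embodiment.

```python
# Hedged sketch of step S1: batch the training data and run it through the
# selected network to obtain per-sample loss values. PyTorch/torchvision and
# the model and batch-size choices here are illustrative assumptions.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import models

model = models.resnet18(num_classes=1000)          # any VGG/Inception/ResNet-style model
criterion = nn.CrossEntropyLoss(reduction="none")  # keep one loss value per sample

def iterate_losses(dataset, batch_size=64):
    """Yield (images, labels, per-sample losses) for each batch."""
    loader = DataLoader(dataset, batch_size=batch_size, shuffle=True)
    for images, labels in loader:
        logits = model(images)
        losses = criterion(logits, labels)  # shape: (batch_size,)
        yield images, labels, losses
```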

[0041] S2: Collect statistics on the gradient of the loss function during model training;

[0042]...
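Since the remaining paragraphs of this embodiment are not reproduced here, the following is only a hedged sketch of step S2: it approximates each sample's gradient magnitude by the norm of the loss gradient with respect to the logits and accumulates the values into a fixed-bin histogram. The bin count, the choice of statistic, and the helper name gradient_histogram are assumptions, not the patent's exact definition.

```python
# Hedged sketch of step S2: collect statistics on the gradient of the loss
# during training. The per-sample gradient magnitude is approximated by the
# norm of dLoss/dLogits, and its distribution is accumulated into a histogram.
import torch

NUM_BINS = 10

def gradient_histogram(logits, labels, criterion, num_bins=NUM_BINS):
    """Return a histogram of per-sample gradient magnitudes normalised to [0, 1]."""
    logits = logits.detach().requires_grad_(True)
    losses = criterion(logits, labels)                  # per-sample losses
    grads, = torch.autograd.grad(losses.sum(), logits)  # dLoss/dLogits per sample
    grad_norm = grads.norm(dim=1)                       # one magnitude per sample
    grad_norm = grad_norm / (grad_norm.max() + 1e-12)   # normalise to [0, 1]
    hist = torch.histc(grad_norm, bins=num_bins, min=0.0, max=1.0)
    return hist, grad_norm
```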

Embodiment 2

[0064] Based on the above model training method, Embodiment 2 of the present invention provides a computer-readable storage medium storing a computer program; when the program is executed by a processor, the gradient-balance-based multi-category model training method is implemented.

Embodiment 3

[0066] Based on the above model training method, Embodiment 3 of the present invention provides an electronic device. As shown in Figure 2, the device includes a memory, a processor, and a display;

[0067] The memory stores a computer program implementing the algorithm;

[0068] The processor is in data connection with the memory and, when invoking the computer program, executes the gradient-balance-based multi-category model training method according to any one of claims 1-7.

[0069] The display is in data connection with the processor and the memory, and displays an operation and interaction interface for the gradient-balance-based multi-category model training method.



Abstract

The invention provides a multi-class model training method based on gradient balance, a medium, and equipment. The model training method comprises the following steps: inputting training sample data into a selected neural network model for training to obtain a loss function; collecting statistics on the gradient distribution of the loss function during model training; distributing sample weights according to the distribution result; performing weight smoothing; performing weight attenuation; and obtaining a new gradient to update the network model. According to the method, the sample weights are adjusted according to the distribution of the loss-function gradient produced by the training sample data during training, so that the influence of difficult samples of different degrees on the model is balanced, the model training time is shortened, and the model precision is improved.
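As a hedged illustration of the steps listed in the abstract (sample weights from the gradient statistics, weight smoothing, weight attenuation, and the weighted gradient update), the sketch below combines them into one helper that reuses the helpers sketched under Embodiment 1. The inverse-density weighting, the exponential-moving-average smoothing, the attenuation schedule, and the class name GradientBalancer are assumptions, not the patented formulas.

```python
# Hedged sketch of the overall pipeline: per-sample losses -> gradient-
# distribution statistics -> density-based sample weights -> smoothing of the
# running histogram -> weight attenuation -> weighted loss for the update.
import torch

class GradientBalancer:
    def __init__(self, num_bins=10, momentum=0.9, attenuation=0.99):
        self.num_bins = num_bins
        self.momentum = momentum          # smoothing of the running histogram
        self.attenuation = attenuation    # decay applied to the sample weights
        self.running_hist = torch.ones(num_bins)

    def weights(self, grad_norm):
        """Per-sample weights: rarer gradient magnitudes get larger weights."""
        hist = torch.histc(grad_norm, bins=self.num_bins, min=0.0, max=1.0)
        # weight smoothing: exponential moving average over training iterations
        self.running_hist = self.momentum * self.running_hist + (1 - self.momentum) * hist
        density = self.running_hist / self.running_hist.sum()
        bins = (grad_norm * (self.num_bins - 1)).long().clamp(0, self.num_bins - 1)
        w = 1.0 / (density[bins] + 1e-12)            # inverse gradient density
        w = w / w.mean()                             # keep the average weight at 1
        return self.attenuation * w + (1 - self.attenuation)  # weight attenuation

# usage inside a training step (model, criterion, optimizer, gradient_histogram
# as in the sketches above; all names are illustrative assumptions):
#   balancer = GradientBalancer()
#   logits = model(images)
#   losses = criterion(logits, labels)                       # per-sample losses
#   _, grad_norm = gradient_histogram(logits, labels, criterion)
#   loss = (balancer.weights(grad_norm) * losses).mean()
#   loss.backward(); optimizer.step(); optimizer.zero_grad()
```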

Description

Technical Field

[0001] The invention relates to the technical field of artificial intelligence image processing, and in particular to a gradient-balance-based multi-category model training method, medium and equipment.

Background

[0002] With the advent of the era of big data and the continuous improvement of computing power, deep learning technology driven by big data has developed rapidly. Deep learning is widely applied in fields such as image classification, target detection (typical applications include face recognition, pedestrian recognition, and vehicle recognition), and image segmentation. Behind these applications are the many excellent neural network structures that have emerged in recent years.

[0003] It can be said that: mature application model = large amount of data + excellent network structure + suitable training method. Through appropriate training methods and a large amount of data, the network model can a...


Application Information

IPC(8): G06K9/62; G06N3/04; G06N20/00
CPC: G06N20/00; G06N3/045; G06F18/214
Inventor: 许轶博, 潘泽文, 范宏伟, 李佳斌
Owner: 成都艾特能电气科技有限责任公司