
Classifier, neural network model training method, data processing device, and medium

A neural network model training technology applied in the field of classification, which addresses the problem of low classifier recognition accuracy and achieves the effect of improved recognition accuracy.

Inactive Publication Date: 2019-06-18
BEIJING QIYI CENTURY SCI & TECH CO LTD
Cites: 0 | Cited by: 32

AI Technical Summary

Problems solved by technology

[0005] In order to solve one of the above problems, the present invention provides a classifier, a neural network model training method, a data processing device, and a medium, so as to overcome the problem of low recognition accuracy of the classifier and thereby improve its recognition accuracy.



Examples


Embodiment Construction

[0030] Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. It should be noted that the numbers, serial numbers, and reference signs in this application are used only for convenience of description and do not impose any restriction on the steps or their order, unless the description clearly indicates that the steps must be executed in a specific sequence.

[0031] The most common method used by neural network classification models to solve multi-class classification problems is...
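The sentence above is truncated in the source page. Assuming it refers to the standard setup of a softmax output layer trained with an ordinary (unweighted) cross-entropy loss, the following is a minimal PyTorch sketch of that baseline; the tensor values are illustrative only.

```python
# Minimal sketch of the usual multi-class baseline: softmax confidences plus
# unweighted cross-entropy loss. Values below are illustrative assumptions.
import torch
import torch.nn.functional as F

logits = torch.tensor([[2.0, 0.5, -1.0]])   # raw scores for one sample, 3 classes
probs = F.softmax(logits, dim=1)            # prediction confidences, summing to 1
target = torch.tensor([0])                  # labeled true class of the sample
loss = F.cross_entropy(logits, target)      # equals -log(probs[0, 0])
print(probs, loss)
```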



Abstract

The invention discloses a classifier, a neural network model training method, a data processing device, and a medium. The classifier comprises: a weighted loss value calculation device for obtaining a weighted loss value of a training sample on each classification category by means of the prediction confidence output from a neural network model; and a parameter adjusting device, to which the weighted loss value is propagated back, so that the parameters of the neural network model are adjusted based on the weighted loss value and the training of the classifier's neural network model can be completed. The weighted loss value calculation device adds a weighting factor into the loss function used to measure the loss cost between a training sample's prediction confidence and its labeled true value, so that the weighted loss value of the training sample on each classification category is obtained through calculation. By adding a weighting factor during loss value calculation, the parameters of the neural network model can be adjusted, the network model can be optimized, and the recognition accuracy of the classifier can be improved.
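The abstract does not give concrete weighting factors or a network architecture, so the following is only a minimal sketch of the idea of a per-class weighting factor inside the loss, using PyTorch's class-weighted cross-entropy. The toy model, weights, and data are illustrative assumptions, not the patent's concrete parameters.

```python
# Sketch of the weighted-loss training step described in the abstract (PyTorch).
import torch
import torch.nn as nn

num_classes = 3
# Hypothetical per-class weighting factors, e.g. larger weights for rarer classes.
class_weights = torch.tensor([1.0, 2.0, 4.0])

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, num_classes))
# Cross-entropy with a per-class weighting factor: each sample's loss cost is
# scaled by the weight of its labeled true class before averaging.
criterion = nn.CrossEntropyLoss(weight=class_weights)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# One training step: forward pass -> weighted loss -> backpropagation -> update.
x = torch.randn(8, 16)                       # a batch of 8 training samples
y = torch.randint(0, num_classes, (8,))      # their labeled true classes
logits = model(x)                            # prediction confidences (unnormalized)
loss = criterion(logits, y)                  # weighted loss value
optimizer.zero_grad()
loss.backward()                              # weighted loss is propagated back
optimizer.step()                             # parameters adjusted based on it
```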

Description

Technical Field

[0001] The invention relates to classification technology, and in particular to a classifier, a neural network model training method, a data processing device, and a medium.

Background Technique

[0002] Deep learning is currently the mainstream method in the field of computer pattern recognition, offering excellent accuracy and wide application. When applying a deep learning model, it is first necessary to train it on a large training data set, using a custom loss function and a backpropagation algorithm (for example, multiple iterations of gradient descent) to iteratively learn optimal neural network model parameters; the model is then deployed for classification or prediction in practical applications after verifying that the target is reached on a validation set.

[0003] In the actual training of deep learning models, the problem of low classification accuracy often occurs due to the unbalance...
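As a rough illustration of the train-then-validate workflow sketched in [0002], here is a hedged PyTorch example; the dataset, model size, learning rate, and number of epochs are assumptions made only so the snippet is runnable.

```python
# Illustrative train-then-validate loop (not the patent's concrete setup).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 3))
criterion = nn.CrossEntropyLoss()                        # custom loss function goes here
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # gradient descent

train_ds = TensorDataset(torch.randn(256, 16), torch.randint(0, 3, (256,)))
val_ds = TensorDataset(torch.randn(64, 16), torch.randint(0, 3, (64,)))

for epoch in range(10):                                  # multiple iterations
    for x, y in DataLoader(train_ds, batch_size=32, shuffle=True):
        loss = criterion(model(x), y)
        optimizer.zero_grad()
        loss.backward()                                  # backpropagation
        optimizer.step()

# Verify performance on the validation set before deploying the model.
with torch.no_grad():
    x, y = val_ds.tensors
    acc = (model(x).argmax(dim=1) == y).float().mean().item()
print(f"validation accuracy: {acc:.2f}")
```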


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62
Inventors: 蔡东阳, 王涛, 刘倩, 刘洁
Owner: BEIJING QIYI CENTURY SCI & TECH CO LTD