
Model training method, customer service system, data labeling system and readable storage medium

A technology for model training and training data sets, applied in the field of data processing, which can solve the problems of the high acquisition cost of labeled data, the low quality of classification models, and the resulting poor response quality of customer service systems, with the effect of improving classification accuracy and quality, reducing cost and workload, and increasing the quality and quantity of data.

Active Publication Date: 2019-04-05
WEBANK (CHINA)

AI Technical Summary

Problems solved by technology

[0006] The main purpose of the present invention is to provide a model training method, a customer service system, a data labeling system, and a readable storage medium, aiming to solve the technical problem that the labeled data required by existing classification models is costly to obtain and the quality of the classification models is low, which in turn leads to poor response quality of the customer service system.

Method used




Detailed Description of the Embodiments

[0056] It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0057] The technical idea of the present invention is briefly described here.

[0058] When training a classification model, the required training data set generally consists of labeled data. Denote the labeled training data set as D_k = {(x_1, y_1), (x_2, y_2), …, (x_k, y_k)}; these k samples have been labeled, i.e. their categories are known, and they are the labeled samples of the labeled data set in this embodiment. In addition, there is a data set D_u = {(x_{k+1}, y_{k+1}), (x_{k+2}, y_{k+2}), …, (x_{k+u}, y_{k+u})}, where k < u; these u samples have not been labeled, i.e. their categories are unknown, and they are the unlabeled samples of the unlabeled data set in this embodiment.
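To make the notation concrete, the two data sets might look as follows in a customer-service setting. This is a minimal illustrative sketch only; the sample utterances and category names are hypothetical and not taken from the patent:

# Labeled data set D_k = {(x_1, y_1), ..., (x_k, y_k)}: categories are known.
D_k = [
    ("how do I reset my login password", "account"),
    ("my transfer failed with an error code", "payment"),
    ("what is the interest rate on this loan", "loan"),
]

# Unlabeled data set D_u = {x_{k+1}, ..., x_{k+u}}: categories are unknown.
D_u = [
    "my card was charged twice",
    "can I change my registered phone number",
    "when will my loan application be approved",
    "the app keeps logging me out",
]

print(f"k = {len(D_k)} labeled samples, u = {len(D_u)} unlabeled samples")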

[0059] If the traditional supervised learning method is adopted, only D_k can be used for th...



Abstract

The invention provides a model training method which comprises the following steps: taking the same labeled data set as the training data set of different classification models, and training the different classification models; based on the trained classification models, predicting prediction samples in an unlabeled data set to obtain prediction results; obtaining the confidence of each prediction result, and marking the prediction sample as a high-confidence prediction sample or a low-confidence prediction sample; adding the prediction samples marked as high-confidence prediction samples by each classification model to the training data sets of the other classification models; and, based on the new training data sets, performing a preset number of rounds of iterative training to obtain the classification models after multiple rounds of iterative training. The invention further provides a customer service system, a data labeling system and a readable storage medium. The invention solves the technical problem that the response quality of a customer service system is poor because the labeled data required by existing classification models is costly to obtain and the quality of the classification models is low.
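As a rough illustration of the iterative scheme summarized above (a co-training-style sketch, not the patent's prescribed implementation), the code below trains two different classifiers on the same labeled data set, lets each one label the unlabeled pool, and hands high-confidence predictions to the other classifier as extra training data for the next round. The model choices (TF-IDF with logistic regression and naive Bayes), the confidence threshold of 0.9 and the three rounds are illustrative assumptions:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline


def co_train(labeled, unlabeled, rounds=3, threshold=0.9):
    """labeled: list of (text, label) pairs; unlabeled: list of texts."""
    # Two different classification models; both start from the same labeled set.
    factories = [
        lambda: make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000)),
        lambda: make_pipeline(TfidfVectorizer(), MultinomialNB()),
    ]
    train_sets = [list(labeled) for _ in factories]
    pool = list(unlabeled)
    models = []

    for _ in range(rounds):                      # preset number of iterative rounds
        models = [build() for build in factories]
        for model, data in zip(models, train_sets):
            texts, labels = zip(*data)
            model.fit(list(texts), list(labels))

        remaining = []
        for text in pool:
            for i, model in enumerate(models):
                proba = model.predict_proba([text])[0]
                confidence = proba.max()
                label = model.classes_[proba.argmax()]
                if confidence >= threshold:      # high-confidence prediction sample
                    # Add it to the training sets of the *other* classification models.
                    for j, data in enumerate(train_sets):
                        if j != i:
                            data.append((text, label))
                    break
            else:
                remaining.append(text)           # low confidence: keep it unlabeled
        pool = remaining

    return models

With D_k and D_u as in the earlier sketch, calling co_train(D_k, D_u) would return the classification models obtained after the preset rounds of iterative training.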

Description

Technical Field

[0001] The invention relates to the field of data processing, and in particular to a model training method, a customer service system, a data labeling system, and a readable storage medium.

Background

[0002] In existing customer service systems, many modules need to use different classification models; these modules include question classification, question ambiguity judgment, sentiment analysis, and so on. Many classification models are commonly used, ranging from logistic regression, SVM (Support Vector Machine), XGBoost (eXtreme Gradient Boosting) and FastText (a shallow network) to deep learning models such as LSTM (Long Short-Term Memory network), CNN (Convolutional Neural Network) and RNN (Recurrent Neural Network), all of which are widely applied to various classification tasks. ...
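As a rough illustration of one such module (an assumption for illustration, not the patent's implementation), a question classifier could be built from any of these model families; the sketch below uses an SVM over TF-IDF features, with hypothetical questions and categories:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

questions = [
    "how do I reset my login password",
    "why was my transfer rejected",
    "I forgot my account password",
    "the payment has not arrived yet",
]
categories = ["account", "payment", "account", "payment"]

# Question-classification module: TF-IDF features fed into a linear SVM.
question_classifier = make_pipeline(TfidfVectorizer(), LinearSVC())
question_classifier.fit(questions, categories)
print(question_classifier.predict(["how can I change my password"]))  # likely ['account'] on this toy data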

Claims


Application Information

IPC(8): G06F16/35; G06F16/332
Inventor 黎洛晨, 郑德荣, 杨海军, 徐倩, 杨强
Owner WEBANK (CHINA)