
Text classification model tuning hyper-parameter recommendation method and device and storage medium

A hyper-parameter recommendation technology for text classification model tuning, applied in the field of machine learning, which solves problems such as the low efficiency of manual tuning and the difficulty of finding a good hyper-parameter combination, and achieves the effect of improving tuning efficiency.

Pending Publication Date: 2021-08-06
SOUTH CHINA NORMAL UNIVERSITY

AI Technical Summary

Problems solved by technology

In deep learning text classification, because there are many kinds of hyper-parameters, manually adjusting them is usually inefficient, and it is difficult to obtain the optimal hyper-parameter combination quickly, conveniently and accurately.




Detailed Description of Embodiments

[0064] Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the drawings, in which the same or similar reference numerals denote the same or similar elements, or elements having the same or similar functions, throughout. The embodiments described below with reference to the figures are exemplary; they are intended only to explain the present invention and should not be construed as limiting it.

[0065] In the description of the present invention, it should be understood that orientation descriptions such as up, down, front, back, left and right indicate orientations or positional relationships based on those shown in the drawings; they are used only to facilitate and simplify the description of the present invention, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific ...



Abstract

The invention discloses a text classification model tuning hyper-parameter recommendation method and device and a storage medium. The method comprises the steps of: constructing a hyper-parameter set according to the hyper-parameter types of a text classification model; obtaining a first group of data by calculation according to the category system and the classification performance index of the text classification model, the first group of data comprising category system weight information and overall classification performance index weight information; training and testing the text classification model according to the hyper-parameter set to obtain a second group of data, the second group of data comprising an overall classification performance result and a category classification performance result set; calculating a third group of data from the second group of data according to the first group of data, the third group of data comprising an overall classification performance comprehensive result and a category classification performance comprehensive result; and sorting the third group of data to obtain a recommended hyper-parameter group. The invention can improve the efficiency of deep learning text classification model tuning, and the method and device can be widely applied in the field of machine learning.
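The flow described in the abstract (build a hyper-parameter set, derive weight information, train and test each combination, compute comprehensive results, and sort) can be sketched in a few lines of code. The Python below is only a minimal illustration under assumed details: the candidate grid, the simulated train_and_evaluate stub, and the simple weighted-sum scoring are placeholders for illustration, not the formulas disclosed in the patent.

# Minimal sketch of the recommendation flow described in the abstract.
# The candidate grid, the simulated training stub, and the weighted-sum
# scoring are assumptions for illustration, not the patented formulas.
from itertools import product
import random

# Step 1: construct the hyper-parameter set from candidate values (assumed grid).
candidates = {
    "learning_rate": [1e-3, 5e-4],
    "batch_size": [16, 32],
    "dropout": [0.1, 0.3],
}
hyperparameter_set = [dict(zip(candidates, values))
                      for values in product(*candidates.values())]

# Step 2: first group of data -- weight of each category in the category system
# and the weight of the overall classification performance index (assumed given).
category_weights = {"sports": 0.5, "finance": 0.3, "tech": 0.2}
overall_metric_weight = 0.6

def train_and_evaluate(params):
    """Placeholder for training/testing the text classification model with
    `params`; returns (overall_f1, {category: f1}). Simulated so the sketch runs."""
    rng = random.Random(repr(sorted(params.items())))
    per_category_f1 = {c: rng.uniform(0.6, 0.9) for c in category_weights}
    overall_f1 = sum(per_category_f1.values()) / len(per_category_f1)
    return overall_f1, per_category_f1

def comprehensive_score(overall_f1, per_category_f1):
    # Step 4: combine the overall result and the per-category results using the
    # weights from step 2 (a plain weighted sum is assumed here).
    category_term = sum(category_weights[c] * f1 for c, f1 in per_category_f1.items())
    return overall_metric_weight * overall_f1 + (1 - overall_metric_weight) * category_term

def recommend(top_k=3):
    scored = []
    for params in hyperparameter_set:
        overall_f1, per_category_f1 = train_and_evaluate(params)  # Step 3: train and test
        scored.append((comprehensive_score(overall_f1, per_category_f1), params))
    # Step 5: sort the comprehensive results and return the recommended groups.
    scored.sort(key=lambda item: item[0], reverse=True)
    return [params for _, params in scored[:top_k]]

if __name__ == "__main__":
    for params in recommend():
        print(params)

In practice the train_and_evaluate stub would be replaced by a real training and testing loop for the text classification model, and the scoring function by whatever weighting the implementer derives from the category system and the overall performance index.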

Description

Technical field

[0001] The invention relates to the field of machine learning, and in particular to a hyper-parameter recommendation method, device and storage medium for text classification model tuning.

Background technique

[0002] Text classification is an important task in the field of machine learning, especially in natural language processing. The performance of deep learning text classification models is usually closely related to the selection of hyper-parameters. Hyper-parameters are generally chosen manually before model training; after the model is trained and tested, the hyper-parameters are adjusted according to the test results and the model is trained again, and the final hyper-parameter combination is determined by comparing the results before and after the adjustment. In deep learning text classification, because there are many kinds of hyper-parameters, manually adjusting them is usually inefficient, and it is difficult to obtain the optimal hyper-parameter combination quickly, conveniently and accurately.
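To make the scale of the problem concrete, a quick count helps (the candidate counts below are illustrative assumptions, not values from the patent): even a modest grid of common text-classification hyper-parameters yields over a hundred combinations, each requiring a full train-and-test cycle, which is why the manual adjust-retrain-compare loop described above scales poorly.

# Back-of-the-envelope count of hyper-parameter combinations; the candidate
# counts are illustrative assumptions, not values taken from the patent.
from math import prod

candidate_counts = {
    "learning_rate": 4,  # e.g. 1e-2, 1e-3, 5e-4, 1e-4
    "batch_size": 3,     # e.g. 16, 32, 64
    "dropout": 3,        # e.g. 0.1, 0.3, 0.5
    "hidden_size": 3,    # e.g. 128, 256, 512
}
combinations = prod(candidate_counts.values())  # 4 * 3 * 3 * 3 = 108
print(f"{combinations} combinations, i.e. {combinations} manual train/test cycles")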


Application Information

IPC(8): G06F16/9535, G06F16/35, G06N3/04, G06N3/08
CPC: G06F16/9535, G06F16/35, G06N3/08, G06N3/045
Inventors: Hao Tianyong (郝天永), Lei Shunwei (雷顺威), Qu Yingying (瞿瑛瑛)
Owner: SOUTH CHINA NORMAL UNIVERSITY