
A text classification method and system for lifelong learning

A lifelong-learning technology for text classification, applied in the field of text classification methods and systems for lifelong learning, achieving the effects of improved accuracy and reduced computing cost

Active Publication Date: 2022-07-12
FOSHAN UNIVERSITY

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to propose a text classification method and system for lifelong learning, so as to solve one or more technical problems in the prior art, or at least to provide a useful alternative.



Examples


Embodiment Construction

[0078] The concept, specific structure, and technical effects of the present invention are described clearly and completely below with reference to the embodiments and the accompanying drawings, so that the purpose, solutions, and effects of the present invention can be fully understood. It should be noted that, in the absence of conflict, the embodiments in the present application and the features of those embodiments may be combined with each other.

[0079] In the description of the present invention, "several" means one or more and "multiple" means two or more; "greater than", "less than", "exceeding", and the like are understood as excluding the stated number, while "above", "below", "within", and the like are understood as including it. Where "first" and "second" are used, they serve only to distinguish technical features and cannot be understood as indicating or implying relative importance, or as indicating the number of the indicated technical features or the order ...


PUM

No PUM

Abstract

The invention provides a text classification method and system for lifelong learning. A pre-trained language model is used as the text classification model, and the model is trained on a task sequence to obtain a lifelong learning model; a storage memory is set, and, given a replay frequency and a storage rate, the lifelong learning model is fine-tuned with sparse experience replay and then fine-tuned again, after which the model is used to predict the output for an input text. This achieves the beneficial effects of, first, reducing time complexity and thereby saving training time, and, at the same time, improving text classification accuracy and alleviating catastrophic forgetting.
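The abstract's training loop (train on a task sequence, write a sparse sample of examples to a storage memory, and replay from that memory at a fixed frequency) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class name, the default replay frequency and storage rate, and the replay batch size of 8 are all assumptions.

```python
import random


class SparseReplayTrainer:
    """Sketch of lifelong-learning fine-tuning with sparse experience replay.

    Hypothetical names and defaults; the patent's actual model, replay
    frequency, and storage rate are not disclosed in this summary.
    """

    def __init__(self, train_step, replay_every=100, store_rate=0.01, seed=0):
        self.train_step = train_step      # callable: trains the model on one batch
        self.replay_every = replay_every  # replay frequency (in training steps)
        self.store_rate = store_rate      # fraction of examples written to memory
        self.memory = []                  # episodic storage memory
        self.rng = random.Random(seed)
        self.step = 0

    def observe(self, batch):
        """Train on a batch from the current task, occasionally replaying."""
        self.train_step(batch)
        # Sparsely store a sample of the current batch in episodic memory.
        for ex in batch:
            if self.rng.random() < self.store_rate:
                self.memory.append(ex)
        self.step += 1
        # Every `replay_every` steps, re-train on a random memory sample
        # to counteract catastrophic forgetting of earlier tasks.
        if self.memory and self.step % self.replay_every == 0:
            replay = self.rng.sample(self.memory, min(8, len(self.memory)))
            self.train_step(replay)
```

In this sketch `train_step` is a stand-in for one gradient update of the pre-trained language model; the sparse storage rate keeps memory growth sublinear in the number of observed examples.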

Description

technical field [0001] The invention belongs to the technical field of text data processing and character processing, and in particular relates to a text classification method and system for lifelong learning. Background technique [0002] Lifelong learning is a continuous-learning process whose research fields include lifelong supervised learning, lifelong reinforcement learning, and open-world learning; this work concerns lifelong supervised learning. When a model that has completed a series of N supervised learning tasks is then trained on data for a new task, the neural network typically forgets the knowledge learned in the previous tasks: the weights learned for the new task may overwrite the weights of the previous tasks, degrading the model's performance on those tasks and leading to catastrophic forgetting. After learning the first N categories of knowledge, the lifelong learning text classification model has the ability to co...
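The weight-overwriting effect described above can be demonstrated on a toy model (an illustration only, not taken from the patent): a one-parameter regressor fitted to task A, then to a conflicting task B, loses its fit to A entirely.

```python
def sgd_fit(w, data, lr=0.1, epochs=200):
    """Fit the scalar weight of the model y = w*x by plain SGD on squared error."""
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x
            w -= lr * grad
    return w

task_a = [(1.0, 2.0), (2.0, 4.0)]    # optimal w = 2
task_b = [(1.0, -2.0), (2.0, -4.0)]  # optimal w = -2

w = sgd_fit(0.0, task_a)                                 # learn task A: w ≈ 2
err_a_before = sum((w * x - y) ** 2 for x, y in task_a)  # ≈ 0
w = sgd_fit(w, task_b)                                   # task B overwrites: w ≈ -2
err_a_after = sum((w * x - y) ** 2 for x, y in task_a)   # large: A is forgotten
```

Since both tasks share the single weight `w`, fitting task B necessarily destroys the task A solution; episodic-memory replay, as in the method above, counteracts exactly this.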

Claims


Application Information

Patent Type & Authority Patents(China)
IPC(8): G06F16/35, G06N3/04, G06N3/08
CPC: G06F16/353, G06F16/355, G06N3/08, G06N3/047, G06N3/045
Inventor 孔蕾蕾彭泽阳齐浩亮韩咏韩中元
Owner FOSHAN UNIVERSITY