
A meta-knowledge fine-tuning method and platform based on domain-invariant features

A technology applied in the field of meta-knowledge fine-tuning methods and platforms based on domain-invariant features, which can solve problems such as the limited effectiveness of compressed models.

Active Publication Date: 2021-04-16
ZHEJIANG LAB

AI Technical Summary

Problems solved by technology

In the fine-tuning stage, existing compression methods for pre-trained language models fine-tune on the specific data set of a downstream task, so the effectiveness of the resulting compressed model is limited to that task's specific data set.




Embodiment Construction

[0033] The invention discloses a meta-knowledge fine-tuning method and platform for a general language model based on domain-invariant features, built on top of a general compression framework for pre-trained language models. Instead of fine-tuning the pre-trained language model on a single task-specific data set, the method fine-tunes it on cross-domain data sets of downstream tasks, so the resulting compressed model is suitable for data scenarios from different domains of similar tasks.
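
The patent does not disclose how the cross-domain data sets are combined during fine-tuning. As a minimal sketch only, assuming standard PyTorch data sets and a round-robin mixing scheme (all names here are hypothetical, not from the patent), each fine-tuning step could draw one mini-batch from every domain of the same task:

```python
from torch.utils.data import DataLoader

def cross_domain_batches(domain_datasets, batch_size=16, steps=1000):
    """Yield one mini-batch per domain at every fine-tuning step,
    restarting a domain's loader whenever it runs out of data, so
    no single domain's data set dominates training."""
    loaders = [DataLoader(ds, batch_size=batch_size, shuffle=True)
               for ds in domain_datasets]
    iterators = [iter(dl) for dl in loaders]
    for _ in range(steps):
        batches = []
        for i, it in enumerate(iterators):
            try:
                batches.append(next(it))
            except StopIteration:          # domain exhausted: restart it
                iterators[i] = iter(loaders[i])
                batches.append(next(iterators[i]))
        yield batches                      # list with one batch per domain
```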

[0034] As shown in figure 1, the present invention designs a meta-knowledge fine-tuning learning method: a learning method based on domain-invariant features. The method learns highly transferable shared knowledge, i.e., domain-invariant features, on different data sets of similar tasks. By introducing domain-invariant features, the fine-tuning network concentrates on learning the common domain features across the domains corresponding to the different data sets of similar tasks, so that it can quickly adapt to any different domain.
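
The patent names a domain-invariant feature loss but its closed form is not given in this summary. One common choice for such an objective is the maximum mean discrepancy (MMD) between the feature distributions of two domains; the sketch below assumes that formulation and is an illustration, not the patent's disclosed formula:

```python
import torch

def rbf_kernel(x, y, sigma=1.0):
    # Pairwise RBF (Gaussian) kernel between rows of x and rows of y.
    sq_dist = torch.cdist(x, y) ** 2
    return torch.exp(-sq_dist / (2.0 * sigma ** 2))

def domain_invariant_loss(feat_a, feat_b, sigma=1.0):
    """Biased MMD^2 estimate between pooled feature batches, each of
    shape (batch, hidden), from two domains. Minimizing it pushes the
    encoder toward features whose distribution is domain-invariant."""
    k_aa = rbf_kernel(feat_a, feat_a, sigma).mean()
    k_bb = rbf_kernel(feat_b, feat_b, sigma).mean()
    k_ab = rbf_kernel(feat_a, feat_b, sigma).mean()
    return k_aa + k_bb - 2.0 * k_ab
```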



Abstract

The invention discloses a meta-knowledge fine-tuning method and platform based on domain-invariant features. The method learns highly transferable shared knowledge, i.e., domain-invariant features, on different data sets of similar tasks; the fine-tuning network concentrates on learning the common domain features across the domains corresponding to those data sets, so that it can quickly adapt to any different domain. The invention improves the parameter-initialization and generalization abilities of the general language model for similar tasks, and the final fine-tuning yields a general compression framework for language models of similar downstream tasks. In the meta-knowledge fine-tuning network, the present invention designs a domain-invariant feature loss function to learn domain-independent general knowledge: it minimizes a domain-invariant feature learning objective to drive the language model to encode domain-invariant features.
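
As a rough sketch of how such an objective could enter the fine-tuning loop, the step below combines a task loss on batches from two domains with a weighted domain-invariance penalty (`domain_invariant_loss` from the MMD sketch above). The model API returning both pooled features and logits is an assumption for illustration, not the patent's interface:

```python
import torch.nn.functional as F

def meta_finetune_step(model, optimizer, batch_a, batch_b, lam=0.1):
    """One step on paired batches from two domains of the same task:
    task loss on both batches plus a weighted domain-invariance term."""
    optimizer.zero_grad()
    feat_a, logits_a = model(batch_a["input_ids"])  # hypothetical API:
    feat_b, logits_b = model(batch_b["input_ids"])  # (pooled feats, logits)
    task_loss = (F.cross_entropy(logits_a, batch_a["labels"])
                 + F.cross_entropy(logits_b, batch_b["labels"]))
    inv_loss = domain_invariant_loss(feat_a, feat_b)
    total = task_loss + lam * inv_loss
    total.backward()
    optimizer.step()
    return total.item()
```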

Description

Technical field

[0001] The invention belongs to the field of language model compression, and in particular relates to a meta-knowledge fine-tuning method and platform based on domain-invariant features.

Background technique

[0002] Pre-trained neural language models improve the performance of a variety of natural language processing tasks by fine-tuning on task-specific training sets. In the fine-tuning stage, existing compression methods for pre-trained language models fine-tune on the specific data set of a downstream task, so the effectiveness of the trained compressed model is limited to that task's specific data set.

Contents of the invention

[0003] The purpose of the present invention is to provide a meta-knowledge fine-tuning method and platform based on domain-invariant features that addresses the deficiencies of the prior art. The present invention introduces meta-knowledge based on domain-invariant features and learns the common domain features on different domains corresponding to different data sets of similar tasks.


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/62; G06N3/04; G06N3/08; G06N5/04; G06N20/00
CPC: G06N3/08; G06N20/00; G06N5/041; G06N3/045; G06F18/2414
Inventors: 王宏升, 单海军, 梁元, 邱启仓
Owner: ZHEJIANG LAB