
Natural language relation extraction method based on multi-task learning mechanism

A multi-task learning and relation extraction technology, applied in the field of natural language relation extraction based on a multi-task learning mechanism, which addresses the problems that existing methods cannot effectively handle long-distance dependencies in sequences and accumulate errors, both of which hinder the relation extraction task.

Active Publication Date: 2020-06-05
EAST CHINA NORMAL UNIV

Problems solved by technology

Methods based on convolutional neural networks and their variants can effectively and automatically extract features, but cannot accurately model sequential (time-series) information; methods based on recurrent neural networks and their variants can automatically capture temporal dependencies in a sequence, but cannot effectively handle long-distance dependencies.
In addition, these deep-neural-network-based methods often rely on additional high-level natural language processing tools to obtain lexical, syntactic, and semantic information, and these extra processing steps accumulate errors.
These problems have hindered further improvement of the relation extraction task, so there is an urgent need for a more effective relation extraction model.




Embodiment Construction

[0021] The present invention will be further described in detail below in conjunction with specific embodiments and the accompanying drawings. Except for the content specifically mentioned below, the processes, conditions, and experimental methods for implementing the present invention are common knowledge in this field, and the present invention imposes no special limitations on them.

[0022] The present invention proposes a natural language relation extraction method based on a multi-task learning mechanism, which is divided into three parts, as shown in Figure 2:

[0023] Input layer: mainly used to process the input data. The input layer is similar to that of the single-task model: it first segments the sentence or sentence pair with WordPiece to obtain a subword sequence. However, unlike the single-task model, in order to avoid the problem of unbalanced dataset sizes across tasks, the training samples of each auxiliary task […]
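The WordPiece segmentation step described above can be sketched as a greedy longest-match over a subword vocabulary. This is a minimal illustration, not the patent's implementation; the tiny vocabulary is hypothetical, and a real system would use a pretrained BERT vocabulary.

```python
# Minimal greedy longest-match WordPiece tokenizer for a single word.
# Continuation pieces carry the "##" prefix, as in BERT vocabularies.
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        # Try the longest substring first, shrinking until a vocab hit.
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub  # non-initial pieces are prefixed
            if sub in vocab:
                piece = sub
                break
            end -= 1
        if piece is None:
            return [unk]  # no valid segmentation exists
        pieces.append(piece)
        start = end
    return pieces

vocab = {"un", "##break", "##able", "break", "able"}
print(wordpiece_tokenize("unbreakable", vocab))
# ['un', '##break', '##able']
```

A sentence would be split on whitespace first and each word segmented this way, yielding the subword sequence fed to the shared encoder.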



Abstract

The invention discloses a natural language relation extraction method based on a multi-task learning mechanism. The method introduces information implied by different tasks through multiple auxiliary tasks to improve the relation extraction effect; it introduces knowledge distillation to strengthen the way the auxiliary tasks guide the training of the multi-task model, and introduces a teacher annealing algorithm for multi-task relation extraction so that the multi-task model can ultimately surpass the single-task models that guide it, finally improving the accuracy of relation extraction. The method comprises the following steps: first, training on the different auxiliary tasks to obtain models for guiding training; then, using the models learned on the auxiliary tasks together with the true labels as supervision information to guide the learning of the multi-task model; finally, evaluating on the SemEval-2010 Task 8 dataset, where the model's performance is superior both to a model using improved BERT alone for relation extraction and to mainstream deep-learning relation extraction models.

Description

Technical field

[0001] The invention relates to relation extraction technology in natural language processing, in particular to a natural language relation extraction method based on a multi-task learning mechanism.

Background technique

[0002] With the advent of the era of big data and artificial intelligence, various types of information on the Internet are growing at an explosive rate, and how to obtain information from massive amounts of data has become an urgent problem to be solved. Information extraction is an important branch of natural language understanding. Its main goal is to automatically extract unstructured information embedded in text and convert it into structured data, and related technical research is gradually emerging. Relation extraction, an important subtask of information extraction, has as its main goal to extract an entity-relation triple relation(e1, e2) for a specific entity pair in a given piece of text. Relation extraction, as […]
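The triple relation(e1, e2) defined above can be made concrete with a small example. The sentence and label below follow the SemEval-2010 Task 8 annotation style referenced in the abstract; the dict layout itself is an illustrative assumption, not a format from the patent.

```python
# One relation-extraction instance: given text with a marked entity
# pair, the task is to predict the relation, yielding a triple.
example = {
    "text": "The <e1>burst</e1> has been caused by water hammer "
            "<e2>pressure</e2>.",
    "e1": "burst",
    "e2": "pressure",
    "relation": "Cause-Effect(e2,e1)",  # e2 (pressure) causes e1 (burst)
}

triple = (example["relation"], example["e1"], example["e2"])
print(triple)
# ('Cause-Effect(e2,e1)', 'burst', 'pressure')
```

The directionality in the label matters: Cause-Effect(e2,e1) and Cause-Effect(e1,e2) are distinct classes in this annotation scheme.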


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F16/35; G06F40/289; G06N3/04; G06N3/08
CPC: G06F16/355; G06N3/08; G06N3/045
Inventors: 胡文心, 王伟杰, 杨静
Owner EAST CHINA NORMAL UNIV