
Entity and relation joint learning method based on attention model

An attention-model-based method for joint learning of entities and relations, applied in the field of entity and relation joint learning based on a concentrated attention model. It addresses problems such as error propagation (errors from the entity recognition module degrading relation classification performance) and the weak context representation of existing approaches.

Pending Publication Date: 2019-12-13
EAST CHINA UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

The shortcomings of the pipeline method are: 1) error propagation, since errors made by the entity recognition module degrade the performance of the subsequent relation classification; 2) the interdependence between the two subtasks is ignored.
Parameter sharing refers to jointly training the named entity recognition model and the relation classification model through a shared layer. The choice of shared layer is critical: existing methods generally use word embeddings plus a BiLSTM network, but recent research shows that the context representations produced by a BiLSTM network are weaker than those of the BERT language model. The joint labeling strategy, by contrast, uses an extended tagging scheme to complete entity recognition and relation extraction simultaneously.
However, the joint labeling strategy requires annotators to change their established habits and increases the learning cost.
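The parameter-sharing idea described above can be illustrated with a minimal sketch: one shared encoder feeds two task-specific heads, so gradients from both tasks update the same shared parameters. All dimensions, the single shared layer, and the mean-pooled relation head are illustrative assumptions, not the patent's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration only.
seq_len, d_model, n_ner_tags, n_rel_types = 6, 8, 5, 3

# Shared encoder parameters: both tasks would backpropagate into W_shared,
# so the representation is trained jointly.
W_shared = rng.normal(size=(d_model, d_model))

# Task-specific output heads.
W_ner = rng.normal(size=(d_model, n_ner_tags))
W_rel = rng.normal(size=(d_model, n_rel_types))

def shared_encode(x):
    """One shared layer (a stand-in for word embedding + BiLSTM or BERT)."""
    return np.tanh(x @ W_shared)

x = rng.normal(size=(seq_len, d_model))   # word embeddings for one sentence
h = shared_encode(x)                      # shared representation

ner_logits = h @ W_ner                    # per-token entity tag scores
rel_logits = h.mean(axis=0) @ W_rel       # sentence-level relation scores

print(ner_logits.shape, rel_logits.shape)
```

The point of the sketch is only that `W_shared` sits below both heads; any encoder (BiLSTM, BERT) could take its place.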

Method used



Embodiment Construction

[0051] In order to make the technical content disclosed in this application more detailed and complete, reference may be made to the drawings and to the following specific embodiments of the present invention; the same symbols in the drawings denote the same or similar components. Those skilled in the art should understand, however, that the examples provided below are not intended to limit the scope of the present invention. In addition, the drawings are for schematic illustration only and are not drawn to scale.

[0052] Please refer to figure 1, a schematic flowchart of the entity-relation joint learning method based on the concentrated attention model provided by an embodiment of the present application. As shown in figure 1, the method may include the following steps:

[0053] A1, add [CLS]...
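The input-construction step above (wrapping the clinical text sequence as [CLS] Sequence [SEP], per the abstract) can be sketched as follows; the function name and the sample tokens are illustrative assumptions:

```python
def to_bert_input(tokens):
    """Wrap a token sequence in the [CLS] ... [SEP] form expected by the
    Embedding layer, as described in the abstract."""
    return ["[CLS]"] + list(tokens) + ["[SEP]"]

# Hypothetical clinical-text tokens, for illustration only.
print(to_bert_input(["patient", "has", "fever"]))
# → ['[CLS]', 'patient', 'has', 'fever', '[SEP]']
```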


Abstract

The present invention provides an entity and relation joint learning method based on an attention model, comprising the following steps: first, a clinical text sequence is input to an Embedding layer in the form [CLS] Sequence [SEP] to obtain an initial vector representation H0 for each word; next, H0 is fed into the first N-K layers of a multi-head self-attention network, which output the context representation information Hm for each word; then, Hm is fed into the remaining K layers to obtain a word vector representation H_m^task specific to the entity recognition or relation extraction task; finally, H_m^task, learned under the matrix MASK_task, is passed to the downstream entity recognition or relation classification layer, which outputs the entity and relation information. Experimental results show that the method of the present invention significantly outperforms other methods on entity recognition, relation extraction, and joint learning, indicating its effectiveness.
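The staged pipeline in the abstract (Embedding → N-K shared self-attention layers producing Hm → K task-specific layers under MASK_task producing H_m^task) can be sketched in numpy. This is a single-head, randomly initialized sketch under assumed dimensions, with an all-ones placeholder mask; the patent's actual multi-head model, mask contents, and downstream layers are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
L, N, K, d = 7, 4, 1, 16   # sequence length, total layers, task-specific layers, width

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_attention_layer(H, Wq, Wk, Wv, mask=None):
    """Single-head self-attention (a stand-in for one multi-head layer)."""
    Q, Kmat, V = H @ Wq, H @ Wk, H @ Wv
    scores = Q @ Kmat.T / np.sqrt(d)
    if mask is not None:            # MASK_task: restrict which positions attend
        scores = np.where(mask, scores, -1e9)
    return softmax(scores) @ V

# Random weights for all N layers (illustrative, untrained).
params = [tuple(rng.normal(size=(d, d)) for _ in range(3)) for _ in range(N)]

H = rng.normal(size=(L, d))             # H0 from the Embedding layer
for Wq, Wk, Wv in params[:N - K]:       # first N-K shared layers -> Hm
    H = self_attention_layer(H, Wq, Wk, Wv)
Hm = H

mask_task = np.ones((L, L), dtype=bool) # placeholder task mask (all positions visible)
H_task = Hm
for Wq, Wk, Wv in params[N - K:]:       # remaining K layers -> H_m^task
    H_task = self_attention_layer(H_task, Wq, Wk, Wv, mask_task)

print(Hm.shape, H_task.shape)           # H_m^task then feeds the downstream task layer
```

In a real implementation each task (entity recognition, relation extraction) would get its own copy of the last K layers and its own MASK_task; the sketch shows only one such branch.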

Description

Technical field [0001] The present invention relates to the technical field of entity-relation joint learning for clinical texts, and more specifically to an entity-relation joint learning method based on a concentrated attention model. Background technique [0002] In recent years, with the widespread adoption of electronic medical records, a large amount of electronic medical record data can be integrated and shared across different medical environments, providing data support for doctors' clinical decision-making and for government health policy formulation. However, most of the information in current electronic medical records is stored as natural language, which existing data mining algorithms cannot process directly. In order to structure electronic medical record text into data that algorithms can process, entity recognition and relation extraction algorithms are used to extract entity-relation...

Claims


Application Information

IPC(8): G16H15/00; G06F17/27
CPC: G16H15/00; Y02A90/10
Inventors: 翟洁, 薛魁, 张欢欢, 叶琪, 阮彤, 周扬名, 马致远
Owner EAST CHINA UNIV OF SCI & TECH