
Sequence labeling method based on multi-head self-attention mechanism

A technology relating to sequence labeling and attention mechanisms, applied in neural learning methods, computer components, natural language data processing, etc.

Pending Publication Date: 2021-02-19
STATE GRID TIANJIN ELECTRIC POWER +1

AI Technical Summary

Problems solved by technology

[0007] The purpose of the present invention is to provide a sequence labeling method based on a multi-head self-attention mechanism, addressing the problems of local dependency and serialized encoding present in prior-art sequence labeling methods.


Examples


Embodiment 1

[0070] The present invention first uses a bidirectional long short-term memory network (BLSTM) to learn the contextual semantic features of the words in a text. Then, based on the hidden representations learned by the BLSTM, a multi-head self-attention mechanism models the semantic relationship between any two words in the text, thereby obtaining the global semantics that each word should attend to. To fully exploit the complementarity of local context semantics and global semantics, the present invention designs three feature fusion methods to fuse the two kinds of semantics, and uses a conditional random field (CRF) model to predict the label sequence based on the fused features.
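The following is a minimal sketch of how a BLSTM encoder can be combined with multi-head self-attention in PyTorch, the framework named in Embodiment 2. Module names, dimensions, and the use of nn.MultiheadAttention (available in PyTorch 1.9 and later) are assumptions for illustration, not the patent's exact implementation.

```python
import torch
import torch.nn as nn

class BLSTMSelfAttentionEncoder(nn.Module):
    """Hypothetical sketch: BLSTM local encoding followed by multi-head
    self-attention for global semantics, as described in Embodiment 1."""

    def __init__(self, emb_dim=100, hidden_dim=200, num_heads=8):
        super().__init__()
        # Bidirectional LSTM: each direction outputs hidden_dim // 2 units,
        # so the concatenated hidden state has size hidden_dim.
        self.blstm = nn.LSTM(emb_dim, hidden_dim // 2, batch_first=True,
                             bidirectional=True)
        # Multi-head self-attention over the BLSTM hidden states models the
        # semantic relation between any two words in the sentence.
        self.self_attn = nn.MultiheadAttention(hidden_dim, num_heads,
                                               batch_first=True)

    def forward(self, word_embeddings, padding_mask=None):
        # word_embeddings: (batch, seq_len, emb_dim)
        local, _ = self.blstm(word_embeddings)  # local context semantics
        global_, _ = self.self_attn(local, local, local,
                                    key_padding_mask=padding_mask)  # global semantics
        return local, global_
```

The two returned tensors correspond to the local and global representations that the patent's three fusion methods would combine; the fusion itself is not shown here.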

Embodiment 2

[0072] The present invention mainly uses deep learning technology and related theoretical methods from natural language processing to realize the sequence labeling task. To ensure normal operation of the system, the computer platform used in the specific implementation should be equipped with no less than 8 GB of memory, no fewer than 4 CPU cores with a clock frequency of no less than 2.6 GHz, a GPU environment, a Linux operating system, and the necessary software environment such as Python 3.6 or above and PyTorch 0.4 or above.

[0073] As shown in Figure 1, the sequence labeling method based on the multi-head self-attention mechanism provided by the present invention mainly includes the following steps, performed in order:

[0074] Step 1, Local Context Semantic Encoding: Use Bidirectional Long Short-Term Memory (BLSTM) to sequentially learn the local context semantic representation of words in the text.

[0075] Step 1.1) Use the Stanford NLP...

Embodiment 3

[0094] The sequence labeling method based on the multi-head self-attention mechanism mainly includes the following steps performed in order:

[0095] Step 1, Local Context Semantic Encoding: Use Bidirectional Long Short-Term Memory (BLSTM) to sequentially learn the local context semantic representation of words in the text.

[0096] Step 1.1, use the Stanford NLP toolkit to segment the input text to obtain the corresponding word sequence X = {x_1, x_2, ..., x_N}.

[0097] For example, given the text "I participated in a marathon in Tianjin yesterday", word segmentation yields the word sequence {"I", "yesterday", "in", "Tianjin", "participated", "了", "one", "marathon", "race"}.
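A minimal sketch of step 1.1, assuming the stanza package as the Python interface to the Stanford NLP toolkit (the patent does not name a specific package, and any Chinese segmenter would serve the same role). The Chinese sentence below is a hypothetical reconstruction of the example above.

```python
import stanza  # assumed Python interface to the Stanford NLP toolkit

# Download the Chinese models and build a tokenization pipeline (run once).
stanza.download("zh")
nlp = stanza.Pipeline("zh", processors="tokenize")

# Hypothetical reconstruction of the example sentence from paragraph [0097].
text = "我昨天在天津参加了一场马拉松比赛"
doc = nlp(text)

# Flatten the segmented words into the word sequence X = {x_1, ..., x_N}.
word_sequence = [word.text for sent in doc.sentences for word in sent.words]
print(word_sequence)
```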

[0098] Step 1.2, considering that words in the text usually contain rich morphological features, such as prefix and suffix information, this step uses a bidirectional LSTM (BLSTM) structure to encode, for each word x_i in the word sequence, the corresponding character-level vector repres...
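Step 1.2 is cut off above; the following is a minimal, hypothetical sketch of the character-level encoding it describes: each word's characters are embedded and run through a BLSTM, and the final states of the two directions are concatenated as the word's character-level representation. All dimensions and layer names are assumptions.

```python
import torch
import torch.nn as nn

class CharBLSTMEncoder(nn.Module):
    """Hypothetical sketch of step 1.2: character-level BLSTM encoding
    that captures morphological features such as prefixes and suffixes."""

    def __init__(self, num_chars, char_emb_dim=30, char_hidden_dim=50):
        super().__init__()
        self.char_emb = nn.Embedding(num_chars, char_emb_dim, padding_idx=0)
        self.char_blstm = nn.LSTM(char_emb_dim, char_hidden_dim,
                                  batch_first=True, bidirectional=True)

    def forward(self, char_ids):
        # char_ids: (num_words, max_chars) character indices of each word x_i
        embedded = self.char_emb(char_ids)
        _, (h_n, _) = self.char_blstm(embedded)
        # h_n: (2, num_words, char_hidden_dim); concatenate the final forward
        # and backward hidden states as the character-level word representation.
        return torch.cat([h_n[0], h_n[1]], dim=-1)  # (num_words, 2 * char_hidden_dim)
```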



Abstract

The invention discloses a sequence labeling method based on a multi-head self-attention mechanism, which comprises the following steps: step 1, local context semantic encoding: learn the local context semantic representation of the words in a text by using a BLSTM in a serialized manner; step 2, global semantic encoding: based on the local context semantic representations encoded in step 1, encode the global semantic representation of the words through a multi-head self-attention mechanism; step 3, semantic feature fusion: fuse the local context semantic representation encoded in step 1 with the global semantic representation encoded in step 2, and take the fusion result as the input semantic feature of step 4; step 4, sequence labeling: to fully consider the dependency relationships between labels in a sequence labeling task, use a CRF to predict the labels; step 5, model training; step 6, model inference. On the basis of the recurrent neural network, a multi-head self-attention mechanism is further introduced to learn the global semantic representation of words, thereby improving the sequence labeling effect.
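As a rough end-to-end sketch of how the six steps of the abstract could be wired together, the following assumes simple additive fusion for step 3 (the patent describes three fusion variants, none reproduced exactly here), the third-party pytorch-crf package for the CRF layer, and a recent PyTorch for nn.MultiheadAttention; all module names and dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn
from torchcrf import CRF  # assumed third-party package "pytorch-crf"

class MHSASequenceLabeler(nn.Module):
    """Hypothetical end-to-end sketch: BLSTM (step 1) -> multi-head
    self-attention (step 2) -> feature fusion (step 3) -> CRF (step 4)."""

    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden_dim=200, heads=8):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.blstm = nn.LSTM(emb_dim, hidden_dim // 2, batch_first=True,
                             bidirectional=True)
        self.attn = nn.MultiheadAttention(hidden_dim, heads, batch_first=True)
        self.proj = nn.Linear(hidden_dim, num_tags)  # emission scores per tag
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, token_ids, tags=None, mask=None):
        local, _ = self.blstm(self.emb(token_ids))   # step 1: local context semantics
        global_, _ = self.attn(local, local, local)  # step 2: global semantics
        fused = local + global_                      # step 3: additive fusion (assumed)
        emissions = self.proj(fused)
        if tags is not None:                         # step 5: training loss
            return -self.crf(emissions, tags, mask=mask)
        return self.crf.decode(emissions, mask=mask)  # step 6: inference (label sequence)
```

During training, the negative CRF log-likelihood is minimized; at inference time, decode returns the most likely label sequence per sentence.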

Description

Technical field

[0001] The invention relates to the technical field of computer applications, and in particular to a sequence labeling method based on a multi-head self-attention mechanism.

Background technique

[0002] Sequence labeling is an important research topic in natural language processing. Its goal is to predict the corresponding label sequence for a given text sequence, and it mainly covers tasks such as named entity recognition (Named Entity Recognition, NER), text chunking (Text Chunking), part-of-speech tagging (Part-Of-Speech, POS), and opinion extraction (Opinion Extraction).

[0003] Most early sequence labeling methods were rule-based. They require the construction of rule templates and a large amount of expert knowledge, consume considerable manpower and material resources, and are difficult to extend and transplant to other fields. For example, Wang Ning and others used a rule-based method to manually establish a knowledge base for financial company name reco...

Claims


Application Information

IPC(8): G06F40/295, G06F40/30, G06F40/126, G06F16/35, G06K9/62, G06N3/04, G06N3/08
CPC: G06F40/295, G06F40/30, G06F40/126, G06F16/35, G06N3/084, G06N3/049, G06N3/045, G06F18/253
Inventor 孟洁李妍刘晨张倩宜王梓蒴单晓怡李慕轩王林刘赫董雅茹
Owner STATE GRID TIANJIN ELECTRIC POWER