
Named entity recognition method and device based on hybrid lattice self-attention network

A named entity recognition technology, applied in biological neural network models, natural language data processing, instruments, etc. It addresses the problems that existing word-feature fusion methods do not account for differences in the semantic expression of word vectors, ignore the role of global lexical information, and cannot effectively enhance the word-level features of character vectors, thereby achieving the effect of improved recognition performance.

Pending Publication Date: 2022-05-03
NANJING UNIV OF AERONAUTICS & ASTRONAUTICS

AI Technical Summary

Problems solved by technology

[0004] Although existing methods have achieved good results in fusing word feature vectors, the existing technical means have two problems: 1) word-feature fusion methods do not take into account the differences in semantic expression among word vectors trained by different models; 2) lexicon-enhancement methods based on learned word weights consider only the influence of each character's matched word pairs on the semantic representation, ignoring the role of global lexical information.

Method used



Examples

Experimental program
Comparison scheme
Effect test

Detailed Description of Embodiments

[0062] The present invention is now described in further detail in conjunction with the accompanying drawings.

[0063] It should be noted that terms such as "upper", "lower", "left", "right", "front", and "rear" used in the invention are only for clarity of description and do not limit the practicable scope of the present invention; changes or adjustments of the relative relationships, without substantive change to the technical content, shall also be regarded as falling within the practicable scope of the present invention.

[0064] The present invention relates to a named entity recognition method based on a mixed lattice self-attention network, comprising the following steps:

[0065] S1: look up in a dictionary the words formed by consecutive characters of the input sentence, merge them into a single multidimensional vector through positional alternation mapping, and encode the sentence feature vector represented by character-word pairs into a single dimension by using mixed word and wor...
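Step S1 can be illustrated with a small sketch. The dictionary, embedding scheme, and vector dimension below are toy assumptions for illustration, not the patent's actual model: dictionary words formed by consecutive characters of the sentence are matched, and character nodes are interleaved with the matched word nodes to form one mixed-lattice sequence.

```python
import hashlib
import numpy as np

DIM = 8  # toy embedding dimension (assumption)

def embed(token: str) -> np.ndarray:
    """Toy embedding: a deterministic pseudo-random vector per token."""
    seed = int(hashlib.md5(token.encode("utf-8")).hexdigest()[:8], 16)
    return np.random.default_rng(seed).standard_normal(DIM)

def build_mixed_lattice(sentence: str, dictionary: set):
    """Return (tokens, vectors): each character followed by every
    dictionary word that starts at that character position."""
    tokens = []
    for i, ch in enumerate(sentence):
        tokens.append(ch)                          # character node
        for j in range(i + 2, len(sentence) + 1):
            if sentence[i:j] in dictionary:        # matched word node
                tokens.append(sentence[i:j])
    vectors = np.stack([embed(t) for t in tokens])
    return tokens, vectors

tokens, vectors = build_mixed_lattice(
    "南京市长江大桥", {"南京", "南京市", "长江", "大桥", "长江大桥"}
)
print(tokens)          # characters interleaved with matched dictionary words
print(vectors.shape)   # one DIM-dim vector per lattice node
```

In this sketch the interleaving plays the role of the "positional alternation mapping": word vectors are slotted in immediately after the character where each word begins, so character-level and word-level features share one sequence.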



Abstract

The invention discloses a named entity recognition method based on a mixed lattice self-attention network, comprising the following steps. S1: encode the sentence feature vector represented by character-word pairs into a matrix of fixed dimension, obtaining a word vector representation of the mixed lattice structure; construct a self-attention network to capture the influence of the word vectors on the character vectors in this representation, enhancing the feature representation of each character vector; fuse the character-word features in the Embedding layer of BERT, obtaining a better character vector representation through fine-tuning during learning; realize the entity sequence labeling task and the decoding process of entity recognition with a BiLSTM-CRF network, completing the modeling of the fused character features and constructing an entity recognition model based on the mixed lattice self-attention network. The method can capture global lexical information, generate semantically rich character vector representations, and improve Chinese named entity recognition accuracy on multiple data sets.
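The CRF decoding step of a BiLSTM-CRF tagger can be sketched with a minimal Viterbi decoder: given per-token emission scores (as a BiLSTM would produce) and a tag-transition matrix, the search returns the highest-scoring tag sequence. All scores below are toy values, not learned parameters from the patent's model.

```python
def viterbi_decode(emissions, transitions, tags):
    """emissions: list of {tag: score} per token;
    transitions: {(prev_tag, cur_tag): score}."""
    n = len(emissions)
    # score[t][tag] = best total score of a path ending at token t with `tag`
    score = [dict() for _ in range(n)]
    back = [dict() for _ in range(n)]
    for tag in tags:
        score[0][tag] = emissions[0][tag]
    for t in range(1, n):
        for cur in tags:
            prev = max(tags, key=lambda p: score[t - 1][p] + transitions[(p, cur)])
            score[t][cur] = (score[t - 1][prev] + transitions[(prev, cur)]
                             + emissions[t][cur])
            back[t][cur] = prev
    # trace back from the best final tag
    last = max(tags, key=lambda tag: score[n - 1][tag])
    path = [last]
    for t in range(n - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

tags = ["B", "I", "O"]
transitions = {(p, c): 0.0 for p in tags for c in tags}
transitions[("O", "I")] = -10.0   # an entity cannot start with "I"
transitions[("B", "I")] = 1.0     # "B" followed by "I" is encouraged
emissions = [
    {"B": 2.0, "I": 0.5, "O": 0.1},
    {"B": 0.2, "I": 1.0, "O": 0.9},
    {"B": 0.1, "I": 0.2, "O": 2.0},
]
print(viterbi_decode(emissions, transitions, tags))  # ['B', 'I', 'O']
```

The transition scores are what the CRF layer contributes over a plain softmax per token: they let the decoder enforce sequence-level constraints such as forbidding an entity from starting with an "I" tag.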

Description

Technical field

[0001] The invention relates to the technical field of natural language processing in artificial intelligence, and in particular to a named entity recognition method and device based on a mixed lattice self-attention network.

Background technique

[0002] Named Entity Recognition (NER), also called entity extraction, was first proposed at the MUC-6 conference. It is a technology in information extraction for extracting entities from text. Early entity recognition used rule-based and statistics-based methods; because these traditional methods rely too heavily on manual design, with small recognition coverage and low recognition accuracy, they have long since been replaced by deep learning methods. In deep-learning-based methods, entity recognition models are divided into character-based and word-based models, and some other languages such as English usually use a character-based model, because e...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F40/295; G06F40/279; G06F40/242; G06N3/04
CPC: G06F40/295; G06F40/279; G06F40/242; G06N3/044
Inventors: 王立松, 何宗锋, 刘绍翰, 刘亮
Owner NANJING UNIV OF AERONAUTICS & ASTRONAUTICS