
Machine reading understanding method based on a multi-head attention mechanism and dynamic iteration

A machine reading comprehension technology, applied in instruments, digital data processing, and special data-processing applications. It addresses problems such as the loss of semantic information, attention computed in only one direction, and the inability to fully fuse the semantic information of the passage and the question, achieving the effect of enriching semantic information.

Inactive Publication Date: 2019-03-19
DALIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

As an early model evaluated on the SQuAD dataset, Match-LSTM provided ideas for the design of subsequent, stronger models, but it has several problems: it computes only one-way attention from the passage word vectors to the question, losing a great deal of semantic information; it performs poorly when the answer is long; and its exact-match score is only about 30%.
BiDAF achieved excellent results in the SQuAD 1.1 evaluation, but a problem remains: it has no self-matching process, so long-range dependencies within the context are not captured well.
R-Net achieved near-human performance in the SQuAD 1.1 evaluation, but some problems remain: (1) both BiDAF and R-Net use a pointer network to predict the answer position in a single pass, which may fail to reach the globally optimal solution.
(2) R-Net has no mechanism comparable to bidirectional attention flow and cannot fully fuse the semantic information of the passage and the question.
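The bidirectional attention flow that the text says R-Net lacks can be sketched as follows. This is a minimal BiDAF-style numpy sketch, not the patent's implementation: the encoder outputs, the dot-product similarity function, and all dimensions are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidirectional_attention(H, U):
    """BiDAF-style two-way attention.

    H: passage encodings, shape (T, d)
    U: question encodings, shape (J, d)
    Returns passage-to-question and question-to-passage summaries.
    """
    S = H @ U.T                      # similarity matrix, shape (T, J)
    # Context-to-query: each passage word attends over the question words.
    a = softmax(S, axis=1)           # (T, J), rows sum to 1
    U_tilde = a @ U                  # (T, d) attended question vectors
    # Query-to-context: which passage words matter most to the question.
    b = softmax(S.max(axis=1))       # (T,)
    h_tilde = b @ H                  # (d,) attended passage vector
    H_tilde = np.tile(h_tilde, (H.shape[0], 1))  # broadcast to (T, d)
    return U_tilde, H_tilde

T, J, d = 5, 3, 4
rng = np.random.default_rng(0)
U_t, H_t = bidirectional_attention(rng.normal(size=(T, d)),
                                   rng.normal(size=(J, d)))
print(U_t.shape, H_t.shape)  # (5, 4) (5, 4)
```

A model that computes only the first direction (as the text says Match-LSTM does) discards the query-to-context summary entirely, which is one concrete way passage/question fusion loses information.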



Examples

Experimental example

[0070] The present invention uses the SQuAD dataset to train and evaluate the model. A dropout ratio of 0.2 is applied between the character-embedding, word-embedding, and model layers, and the model is optimized with AdaDelta at an initial learning rate of 1.0, with ρ = 0.95 and ε = 1×10⁻⁶. The training batch size is 12.
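The AdaDelta update rule with the hyperparameters stated above (ρ = 0.95, ε = 1×10⁻⁶, learning rate 1.0) can be sketched in numpy; the toy quadratic objective is an illustrative stand-in for the model's loss, not part of the patent.

```python
import numpy as np

def adadelta_step(x, grad, state, rho=0.95, eps=1e-6, lr=1.0):
    """One AdaDelta update with the hyperparameters from the text.

    state accumulates running averages of squared gradients (Eg2)
    and squared updates (Edx2), as in Zeiler's AdaDelta.
    """
    state["Eg2"] = rho * state["Eg2"] + (1 - rho) * grad**2
    dx = -np.sqrt(state["Edx2"] + eps) / np.sqrt(state["Eg2"] + eps) * grad
    state["Edx2"] = rho * state["Edx2"] + (1 - rho) * dx**2
    return x + lr * dx

# Minimize f(x) = x^2 (gradient 2x) as a toy stand-in for the model loss.
x, state = 5.0, {"Eg2": 0.0, "Edx2": 0.0}
for _ in range(500):
    x = adadelta_step(x, 2 * x, state)
print(abs(x) < 5.0)  # the iterate has moved toward the minimum at 0
```

Note that AdaDelta's step size adapts from the two running averages, which is why the method needs no hand-tuned learning-rate schedule beyond the initial scale.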

[0071] Model training requires the coordination of the model's encoding layer, recurrent neural network layer, self-attention layer, and output layer, as follows:

[0072] (1) Encoding layer

[0073] First, the word-segmentation tool Spacy is used to tokenize each article and question. The maximum number of words in an article is set to 400, and the maximum number of words in a question is set to 50. Samples are truncated to these limits: the portion of the text beyond the set length is discarded, with ...
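The truncation step can be sketched as follows. A plain whitespace split stands in for Spacy's tokenizer here; only the limits 400 and 50 come from the text.

```python
MAX_PASSAGE_WORDS = 400   # limit from the text
MAX_QUESTION_WORDS = 50

def truncate(text, limit):
    """Tokenize (whitespace stand-in for Spacy) and drop words past the limit."""
    return text.split()[:limit]

passage_tokens = truncate("word " * 1000, MAX_PASSAGE_WORDS)
question_tokens = truncate("what is attention ?", MAX_QUESTION_WORDS)
print(len(passage_tokens), len(question_tokens))  # 400 4
```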


Abstract

The invention provides a machine reading comprehension method based on a multi-head attention mechanism and dynamic iteration, and belongs to the field of natural language processing. The construction of the machine reading comprehension model comprises the following steps: constructing an article and question encoding layer; constructing a recurrent neural network based on bidirectional attention flow; constructing a self-attention layer; and predicting the answer output with a dynamic iterative decoder. The method can predict answers to questions over the text of a machine reading comprehension task; the invention establishes a new end-to-end neural network model and provides a new idea for exploring the machine reading comprehension task.

Description

Technical field

[0001] The invention belongs to the field of machine reading comprehension and relates to a method that encodes articles and questions and then uses bidirectional attention flow, a self-attention layer, and a dynamic iterative decoder to predict the answer output. Specifically, it comprises constructing an article and question encoding layer, constructing a recurrent neural network based on bidirectional attention flow, constructing a self-attention layer, and predicting the answer output with a dynamic iterative decoder.

Background technique

[0002] The main form of the question-answering reading comprehension task is to give a short passage with a certain vocabulary together with questions based on that passage, restricting each answer to a text span in the original passage. The answer must be inferred through some form of logical reasoning on the basis of fully understanding the original text. At present, the mainstream models in this field mainly in...
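The multi-head attention mechanism named in the title can be sketched in numpy as scaled dot-product attention computed independently over slices of the feature dimension. This is a generic sketch, not the patent's implementation: the learned per-head projection matrices of a full implementation are omitted, and the head count and dimensions are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(Q, K, V, n_heads):
    """Scaled dot-product attention computed per head, then concatenated.

    Q, K, V: shape (T, d) with d divisible by n_heads.
    Learned input/output projections are omitted for brevity.
    """
    T, d = Q.shape
    d_h = d // n_heads
    outputs = []
    for h in range(n_heads):
        sl = slice(h * d_h, (h + 1) * d_h)
        scores = Q[:, sl] @ K[:, sl].T / np.sqrt(d_h)   # (T, T)
        outputs.append(softmax(scores, axis=-1) @ V[:, sl])
    return np.concatenate(outputs, axis=1)              # back to (T, d)

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 8))
out = multi_head_attention(X, X, X, n_heads=4)  # self-attention over X
print(out.shape)  # (6, 8)
```

Splitting the representation across heads lets each head attend to a different aspect of the sequence, which is the usual motivation for multi-head over single-head attention.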

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/27
CPC: G06F40/289, G06F40/30
Inventors: 李丽双, 张星熠, 周安桥, 周瑜辉
Owner: DALIAN UNIV OF TECH