
Text summarization method based on hierarchical interaction attention

A technology combining hierarchical attention and summarization, applied in the field of natural language processing, which solves the problem of ignoring detailed features such as word-level structure and achieves the effects of improving generation quality and reducing redundancy and noise.

Active Publication Date: 2019-11-19
KUNMING UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

Traditional attention-based encoder-decoder models usually consider only the high-level semantic information of the encoder as the semantic representation of the context, ignoring detailed features such as the word-level structure captured by the lower layers of the network.



Examples


Embodiment 1

[0032] Example 1: As shown in Figures 1-4, the specific steps of the text summarization method based on hierarchical interactive attention are as follows:

[0033] Step 1: Use the English Gigaword dataset and preprocess it with a preprocessing script to obtain a training set of 3.8 million samples and a development set of 189,000 samples. Each training sample contains a pair consisting of an input text and a summary sentence;
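For orientation, a minimal Python sketch of reading such parallel article/summary files; the file names and whitespace tokenization are assumptions, since the patent only states that a preprocessing script is used:

    # Minimal sketch: iterate over parallel source/summary files.
    # File names below are hypothetical, not specified by the patent.
    def load_pairs(src_path, tgt_path):
        """Yield (article_tokens, summary_tokens) pairs, one pair per line."""
        with open(src_path, encoding="utf-8") as src, \
             open(tgt_path, encoding="utf-8") as tgt:
            for article, summary in zip(src, tgt):
                yield article.strip().split(), summary.strip().split()

    train_pairs = list(load_pairs("train.article.txt", "train.title.txt"))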

[0034] Step 2: The encoder uses a bidirectional LSTM to encode the input text, with the number of layers set to three;
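A minimal PyTorch sketch of such an encoder, written as a stack of single-layer bidirectional LSTMs so that every layer's outputs remain available for the later interaction attention; vocabulary and hidden sizes are illustrative assumptions:

    import torch
    import torch.nn as nn

    class HierEncoder(nn.Module):
        """Stack of three bidirectional LSTM layers that keeps each layer's
        outputs, so low-level (word-structure) features stay available."""
        def __init__(self, vocab_size=50000, emb_dim=256, hid_dim=256, n_layers=3):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.layers = nn.ModuleList([
                nn.LSTM(emb_dim if i == 0 else 2 * hid_dim, hid_dim,
                        bidirectional=True, batch_first=True)
                for i in range(n_layers)
            ])

        def forward(self, tokens):
            x = self.embed(tokens)            # (batch, src_len, emb_dim)
            layer_outputs = []
            for lstm in self.layers:
                x, _ = lstm(x)                # (batch, src_len, 2*hid_dim)
                layer_outputs.append(x)
            return layer_outputs              # one tensor per encoder layer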

[0035] Step 3: The decoder adopts a unidirectional LSTM network; the sentence to be decoded is input to it, and a context vector is calculated for each encoder layer;
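A hedged sketch of computing one context vector per encoder layer at a single decoding step. Plain dot-product scoring is an assumption here; the patent does not give the scoring function at this point:

    import torch
    import torch.nn.functional as F

    def layer_contexts(dec_state, layer_outputs):
        """For each encoder layer, attend over its outputs with the current
        decoder hidden state and return one context vector per layer.
        dec_state: (batch, 2*hid_dim); layer_outputs: list of
        (batch, src_len, 2*hid_dim) tensors from the encoder sketch above."""
        contexts = []
        for enc_out in layer_outputs:
            scores = torch.bmm(enc_out, dec_state.unsqueeze(2))  # (B, src_len, 1)
            weights = F.softmax(scores, dim=1)                   # attention weights
            contexts.append((weights * enc_out).sum(dim=1))      # (B, 2*hid_dim)
        return contexts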

[0036] Step 4: For the multi-layer encoder-decoder model, the encoder and decoder each contain multiple LSTM layers. In each LSTM layer, the hidden-state representation between the upper layer and the current layer is calculated, so as to fuse...
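The source text breaks off here. As one plausible reading, the hidden states of adjacent layers are fused before attention is applied; a sketch of such a fusion follows, where the gating form is purely an assumption and not the patent's stated formula:

    import torch
    import torch.nn as nn

    class LayerFusion(nn.Module):
        """Fuse the hidden states of the upper layer and the current layer
        with a learned gate; one possible instantiation of the interaction
        described in Step 4 (the exact formula is truncated in the source)."""
        def __init__(self, dim=512):
            super().__init__()
            self.gate = nn.Linear(2 * dim, dim)

        def forward(self, upper, current):
            # upper, current: (batch, src_len, dim) outputs of adjacent layers
            g = torch.sigmoid(self.gate(torch.cat([upper, current], dim=-1)))
            return g * upper + (1 - g) * current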



Abstract

The invention relates to a text summarization method based on hierarchical interaction attention, and belongs to the technical field of natural language processing. The method guides the generation of the summary by extracting feature information from different levels of the encoder through hierarchical interaction attention. Meanwhile, to avoid the information redundancy introduced by features from different levels, data noise is compressed with a variational information bottleneck. For generative text summarization, under an attention-based encoder-decoder framework, the multi-layer context information of the encoder is extracted through an attention mechanism to guide the decoding process, and a variational information bottleneck is introduced to constrain the information, so that the quality of the generated summaries is improved. Experimental results show that the method can significantly improve the performance of the encoder-decoder framework on the generative summarization task.

Description

Technical field

[0001] The invention relates to a text summarization method based on hierarchical interactive attention, and belongs to the technical field of natural language processing.

Background technique

[0002] With the development of deep learning technology, generative text summarization methods have become a hot research topic. Traditional encoder-decoder models based on the attention mechanism usually consider only the high-level semantic information of the encoder as the semantic representation of the context, ignoring detailed features such as the word-level structure captured by the lower layers of the network. The present invention proposes a multi-layer feature extraction and fusion method based on a hierarchical interactive attention mechanism to obtain features from different levels of the encoder, and at the same time introduces a variational information bottleneck at the decoding end to compress and denoise the fused information, thereby generating higher-quality summaries.
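For orientation only, a minimal sketch of a variational information bottleneck of the kind described, applied to a fused feature vector; the Gaussian reparameterization and standard-normal prior are standard VIB choices, not details taken from the patent:

    import torch
    import torch.nn as nn

    class VIB(nn.Module):
        """Compress a fused feature vector through a stochastic bottleneck:
        encode it as a Gaussian, sample with the reparameterization trick,
        and penalize divergence from a standard normal prior."""
        def __init__(self, in_dim=512, z_dim=128):
            super().__init__()
            self.mu = nn.Linear(in_dim, z_dim)
            self.logvar = nn.Linear(in_dim, z_dim)

        def forward(self, h):
            mu, logvar = self.mu(h), self.logvar(h)
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
            kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
            return z, kl  # add beta * kl to the training loss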


Application Information

IPC(8): G06F17/27, G06F16/34, G06F16/35, G06N3/04, G06N3/08
CPC: G06F16/345, G06F16/35, G06N3/08, G06N3/044, G06N3/045
Inventors: 余正涛, 周高峰, 黄于欣, 高盛祥, 郭军军, 王振晗
Owner: KUNMING UNIV OF SCI & TECH