
Text abstract generation system and method based on adversarial learning and hierarchical neural network

A neural network and generation system technology, applied to neural learning methods, biological neural network models, neural architectures, etc. It addresses problems such as overly long input sequences and inaccurate capture of key information in summaries, with the effect of improving attention, refining its granularity and accurately capturing key words.

Active Publication Date: 2021-03-09
CHONGQING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

In the traditional attention-based seq2seq model, an input sequence that is too long cannot be memorized well in context, which results in a weak understanding of the text; the attention mechanism is also coarse-grained, so key information is captured inaccurately and summary accuracy is low. To address this, a hierarchical neural-network encoding scheme is proposed, divided into a word-embedding level and a sentence-embedding level, with an enhanced memory mechanism introduced at each level. The benefit is to reduce the error in back-propagation and to refine the attention granularity of the traditional seq2seq attention mechanism, so that it captures the key information of the article more accurately. In addition, adversarial learning is introduced during decoding: a recognizer is set up to distinguish the standard representation from the fuzzy representation and to narrow the gap between the two, while supervised learning simultaneously keeps them from converging, forming an adversarial game. When this adversarial process reaches equilibrium, the optimal generation result is found, improving the accuracy of text summary generation and therefore of the final generated summary.
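As a rough illustration of the layered encoding idea described above, the PyTorch sketch below encodes each sentence at the word level and then encodes the resulting sentence vectors at the document level. The module names, mean-pooling choice and dimensions are illustrative assumptions rather than the patent's exact architecture, and the enhanced memory mechanism is not modeled here.

    import torch
    import torch.nn as nn

    class HierarchicalEncoder(nn.Module):
        """Illustrative two-level encoder: words -> sentence vectors -> document context."""
        def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Word-embedding level: encodes the tokens inside each sentence.
            self.word_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
            # Sentence-embedding level: encodes the sequence of sentence vectors.
            self.sent_rnn = nn.GRU(2 * hid_dim, hid_dim, batch_first=True, bidirectional=True)

        def forward(self, docs):
            # docs: (batch, n_sents, n_words) token ids
            b, n_sents, n_words = docs.shape
            words = self.embed(docs.view(b * n_sents, n_words))        # (b*s, w, emb)
            word_states, _ = self.word_rnn(words)                      # (b*s, w, 2h)
            sent_vecs = word_states.mean(dim=1).view(b, n_sents, -1)   # crude pooling per sentence
            sent_states, _ = self.sent_rnn(sent_vecs)                  # (b, s, 2h)
            return word_states.view(b, n_sents, n_words, -1), sent_states

A decoder can then attend over both the word-level and the sentence-level states, which is the sense in which the hierarchy refines the attention granularity compared with a flat encoder.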


Detailed Description of Embodiments

[0036] The technical solutions in the embodiments of the present invention will be described clearly and in detail below with reference to the accompanying drawings. The described embodiments are only some, rather than all, of the embodiments of the invention.

[0037] The technical solution by which the present invention solves the above-mentioned problems is as follows:

[0038] It should be noted that natural language processing is an important direction in the fields of computer science and artificial intelligence. It mainly involves theories and methods for realizing effective communication between humans and computers using natural language. The text data from Weibo or WeChat may be obtained with a crawler or with other data-acquisition software; this is not specifically limited in this embodiment of the present invention.
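For context only, a minimal sketch of obtaining raw text with a crawler is shown below; the URL, the choice of the requests/BeautifulSoup libraries and the parsing are placeholder assumptions rather than a method mandated by the embodiment.

    import requests
    from bs4 import BeautifulSoup

    def fetch_text(url: str) -> str:
        """Download one page and return its visible text (placeholder parsing)."""
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
        return BeautifulSoup(resp.text, "html.parser").get_text(separator="\n")

    # Example with a hypothetical URL:
    # raw = fetch_text("https://example.com/some-post")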

[0039] Figure 1 is a schematic diagram of the system module structure of the present invention. Acco...



Abstract

The invention claims a text abstract generation system and method based on adversarial learning and a hierarchical neural network, and belongs to the field of text summarization in natural language processing. The system comprises a discriminator module, a preprocessing module, a word embedding module, a sentence embedding module, a generation module and an adversarial learning module. On the basis of the encoder-decoder (Seq2Seq) model, a new hierarchical division is provided: the encoder part of the Seq2Seq is divided into a word embedding layer and a sentence embedding layer, and an enhanced memory mechanism is introduced into each layer so that the model can better understand the meaning of the text. Adversarial learning is introduced during decoding: a recognizer is arranged to recognize the standard representation and the fuzzy representation and to shorten the distance between the two, while supervised learning simultaneously prevents the standard and fuzzy representations from approaching each other, forming an adversarial game. When this adversarial process is balanced, the optimal generation result is found, so the accuracy of text abstract generation is improved.
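A minimal sketch of the adversarial objective implied by the abstract is given below: the recognizer (discriminator) scores the standard representation against the fuzzy representation, the generator is pushed to close that gap, and a supervised cross-entropy term keeps the generated summary anchored to the reference. All function and tensor names here are illustrative assumptions, not the patent's notation.

    import torch
    import torch.nn.functional as F

    def discriminator_loss(recognizer, standard_repr, fuzzy_repr):
        # The recognizer should score the standard representation high
        # and the (detached) fuzzy representation low.
        real = recognizer(standard_repr)
        fake = recognizer(fuzzy_repr.detach())
        return (F.binary_cross_entropy_with_logits(real, torch.ones_like(real)) +
                F.binary_cross_entropy_with_logits(fake, torch.zeros_like(fake)))

    def generator_loss(recognizer, fuzzy_repr, logits, target_ids, adv_weight=0.1):
        # Adversarial term: pull the fuzzy representation toward what the
        # recognizer accepts as standard.
        fake = recognizer(fuzzy_repr)
        adv = F.binary_cross_entropy_with_logits(fake, torch.ones_like(fake))
        # Supervised term: cross-entropy between generated token logits and
        # the reference summary, which counteracts the adversarial pull.
        sup = F.cross_entropy(logits.view(-1, logits.size(-1)), target_ids.view(-1))
        return sup + adv_weight * adv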

Description

Technical Field

[0001] The invention belongs to the field of text summarization in natural language processing, and in particular relates to a text summarization method and system based on adversarial learning and a hierarchical neural network.

Background Art

[0002] With the explosive growth of Internet text information in recent years, people are exposed to massive amounts of text every day, such as news, blogs, chats, reports, papers and Weibo posts. Extracting the important content from this large volume of text has become an urgent need, and automatic text summarization provides an efficient solution.

[0003] The traditional attention-based encoder-decoder model first encodes the words of the text, then applies the attention mechanism to learn the key information of the article, and finally decodes the word encoding to generate a text summary. The attention granularity of this type of method is relatively coarse, and the learning of long texts ca...
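For reference, a minimal PyTorch sketch of one decoding step of such a conventional attention-based encoder-decoder is shown below; dimensions and layer choices are illustrative assumptions. Its single soft attention over word states is exactly the coarse granularity that the invention's hierarchical encoder refines.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AttentionDecoderStep(nn.Module):
        """One decoding step of a conventional attention-based seq2seq baseline."""
        def __init__(self, hid_dim=256, vocab_size=30000):
            super().__init__()
            self.attn = nn.Linear(2 * hid_dim, 1)
            self.gru = nn.GRUCell(hid_dim, hid_dim)
            self.out = nn.Linear(2 * hid_dim, vocab_size)

        def forward(self, dec_state, enc_states):
            # enc_states: (batch, src_len, hid); dec_state: (batch, hid)
            scores = self.attn(torch.cat(
                [enc_states, dec_state.unsqueeze(1).expand_as(enc_states)], dim=-1)).squeeze(-1)
            weights = F.softmax(scores, dim=-1)                  # word-level attention only
            context = torch.bmm(weights.unsqueeze(1), enc_states).squeeze(1)
            new_state = self.gru(context, dec_state)
            return self.out(torch.cat([new_state, context], dim=-1)), new_state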


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F16/34, G06F40/126, G06F40/211, G06F40/284, G06F40/295, G06K9/62, G06N3/04, G06N3/08
CPC: G06F16/345, G06F40/126, G06F40/211, G06F40/284, G06F40/295, G06N3/084, G06N3/047, G06N3/045, G06F18/2415
Inventor: 黄海辉, 查茂鸿, 常光辉, 胡诗洋
Owner: CHONGQING UNIV OF POSTS & TELECOMM