A neural network question generation method based on answers and answer position information

A technology combining position information and neural networks, applied to the question generation task in research on neural network question answering systems

Publication Status: Inactive | Publication Date: 2019-04-26
中科国力(镇江)智能技术有限公司

AI Technical Summary

Problems solved by technology

[0018] The technical problem to be solved by the present invention: Aiming at the above two problems in existing neural-network-based question generation methods, the present invention provides a sequence-to-sequence neural network training model that incorporates an attention mechanism. The model focuses on adding features such as the answer and its position information in the original text, so as to achieve higher accuracy of the generated question words and higher average precision and average recall when copying unregistered (out-of-vocabulary) words.




Embodiment Construction

[0055] To explain the present invention more clearly, the following symbols are defined and explained:

[0056] (1) $\{w_i\}_{i=1}^{T_x}$ represents the set of input feature vectors; the feature vector of each word satisfies $w_i \in \mathbb{R}^{d_w + d_a + d_n + d_p}$, where $T_x$ is the length of the input text and $d_w$, $d_a$, $d_n$, $d_p$ are, respectively, the dimensions of the word vector, the vector encoding the answer's position in the original text, the named entity vector, and the part-of-speech vector, $i \in [1, T_x]$.
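
A minimal PyTorch sketch of how such concatenated input features could be built; the class name, embedding dimensions, and tag-set sizes are illustrative assumptions, not values specified by the patent:

```python
import torch
import torch.nn as nn

class InputFeatureEmbedding(nn.Module):
    """Concatenate word, answer-position, named-entity and part-of-speech
    embeddings into one feature vector per token (d_w + d_a + d_n + d_p)."""

    def __init__(self, vocab_size, n_ans_tags, n_ner_tags, n_pos_tags,
                 d_w=300, d_a=16, d_n=16, d_p=16):  # hypothetical dimensions
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d_w)
        self.ans_emb = nn.Embedding(n_ans_tags, d_a)   # answer-position tags, e.g. B/I/O
        self.ner_emb = nn.Embedding(n_ner_tags, d_n)
        self.pos_emb = nn.Embedding(n_pos_tags, d_p)

    def forward(self, words, ans_tags, ner_tags, pos_tags):
        # Each argument is a (batch, T_x) tensor of indices.
        return torch.cat([self.word_emb(words),
                          self.ans_emb(ans_tags),
                          self.ner_emb(ner_tags),
                          self.pos_emb(pos_tags)], dim=-1)  # (batch, T_x, d_w+d_a+d_n+d_p)
```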

[0057] (2) $\{h_i\}_{i=1}^{T_x}$ represents the sequence of hidden states in the encoder neural network model; each hidden state is the concatenation of the forward and backward LSTM representations, $h_i = [\overrightarrow{h_i}; \overleftarrow{h_i}]$, and every $h_i$ is a 512-dimensional vector, $i \in [1, T_x]$.
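
A minimal sketch of such an encoder, assuming PyTorch and a hidden size of 256 per direction so that the concatenated state is 512-dimensional; names and sizes are illustrative:

```python
import torch.nn as nn

class BiLSTMEncoder(nn.Module):
    """Bidirectional LSTM over the input feature vectors; each hidden state
    h_i concatenates the forward and backward states (2 * 256 = 512 dims)."""

    def __init__(self, input_dim, hidden_dim=256):
        super().__init__()
        self.bilstm = nn.LSTM(input_dim, hidden_dim,
                              batch_first=True, bidirectional=True)

    def forward(self, features):
        # features: (batch, T_x, input_dim) -> h: (batch, T_x, 2 * hidden_dim)
        h, _ = self.bilstm(features)
        return h
```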

[0058] (3) At each decoding step $t$, the context vector that is independent of the answer's position in the original text is denoted $c_t$, and the context vector that depends on the answer's position in the original text is denoted $c'_t$.
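
One common way to obtain such context vectors is additive (Bahdanau-style) attention over the encoder states; the sketch below is an assumption about the form, not the patent's exact formulation. One instance applied to the plain encoder states would yield $c_t$; a second instance whose inputs also carry the answer-position features would yield $c'_t$:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAttention(nn.Module):
    """Given the decoder state s_t and encoder states h_1..h_Tx, return the
    attention weights and the resulting context vector."""

    def __init__(self, enc_dim, dec_dim, attn_dim=256):  # attn_dim is an assumption
        super().__init__()
        self.W_h = nn.Linear(enc_dim, attn_dim, bias=False)
        self.W_s = nn.Linear(dec_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, s_t, h):
        # s_t: (batch, dec_dim), h: (batch, T_x, enc_dim)
        scores = self.v(torch.tanh(self.W_h(h) + self.W_s(s_t).unsqueeze(1)))
        alpha = F.softmax(scores.squeeze(-1), dim=-1)           # (batch, T_x)
        context = torch.bmm(alpha.unsqueeze(1), h).squeeze(1)   # (batch, enc_dim)
        return alpha, context
```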

[0059] (4) The a...



Abstract

The invention discloses a neural network question generation method based on answers and answer position information. The neural network model is a sequence-to-sequence model consisting of an encoder and a decoder: the encoder learns features of the original text, and the decoder uses those features to generate the question sentence corresponding to the answer. The complete generation process of the question sentence is divided into three modes: a question word generation mode, a dictionary word generation mode, and a copy mode. The question word generation mode uses an answer-based model, participates in the model computation through context vectors that are independent of the answer's position in the original text, and generates the question word corresponding to the answer from a limited question-word dictionary. The dictionary word generation mode computes with context vectors that depend on the answer's position in the original text. The probability distribution of the copy mode directly uses the attention distribution related to the answer's position in the original text.
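
A hypothetical sketch of how the three modes could be combined at each decoding step; the mode-switch network, layer shapes, and names below are assumptions made for illustration and not the patent's exact formulation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ThreeModeOutput(nn.Module):
    """Combine three distributions: question words (scored with the
    position-independent context c_t), dictionary words (scored with the
    position-aware context c'_t), and a copy distribution that directly
    reuses the position-aware attention weights over the source tokens."""

    def __init__(self, dec_dim, ctx_dim, n_question_words, dict_size):
        super().__init__()
        self.mode_switch = nn.Linear(dec_dim + 2 * ctx_dim, 3)
        self.qword_out = nn.Linear(dec_dim + ctx_dim, n_question_words)
        self.dict_out = nn.Linear(dec_dim + ctx_dim, dict_size)

    def forward(self, s_t, c_t, c_prime_t, alpha_prime):
        # s_t: decoder state; alpha_prime: position-aware attention (batch, T_x)
        mode = F.softmax(self.mode_switch(
            torch.cat([s_t, c_t, c_prime_t], dim=-1)), dim=-1)             # (batch, 3)
        p_question = F.softmax(self.qword_out(torch.cat([s_t, c_t], dim=-1)), dim=-1)
        p_dict = F.softmax(self.dict_out(torch.cat([s_t, c_prime_t], dim=-1)), dim=-1)
        p_copy = alpha_prime  # copy probabilities follow the attention distribution
        return mode, p_question, p_dict, p_copy
```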

Description

Technical field

[0001] The invention relates to the fields of English natural language processing, question answering systems, and machine learning, and in particular to the question generation task in research on deep-learning neural network question answering systems.

Background technique

[0002] The task of question generation in question answering system research is to generate, from a given piece of content text and an answer text related to that content, related questions that correspond to the answer.

[0003] In recent years, research methods for the question generation task have mainly fallen into two categories: rule-based methods and neural-network-based methods. Compared with rule-based methods, neural-network-based question generation methods are more data-driven, support end-to-end training, and do not rely on hand-written rules.

[0004] However, two problems remain in existing research on question generation based on neural network models:

[0005] 1. The ...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F16/332, G06F16/35, G06F17/27, G06N3/04
CPC: G06F40/242, G06N3/044, G06N3/045
Inventor: 王石, 资康莉, 符建辉, 王卫民, 曹存根
Owner: 中科国力(镇江)智能技术有限公司