
End-to-end context-based knowledge base question and answer method and device

A technology combining context and a knowledge base, applied in the field of knowledge base question answering, which solves the problem that entities and relations are treated independently and that relations have only a single form of representation, achieving the effects of increased representational richness, easier capture of context, and avoidance of error propagation.

Active Publication Date: 2019-07-26
SOUTHEAST UNIV


Problems solved by technology

[0005] Purpose of the invention: To overcome the deficiencies of the prior art, the present invention provides an end-to-end context-based knowledge base question answering method and device, which solves the problems that, in current end-to-end models, entities and relations are treated relatively independently and the representation of relations takes a relatively simple form.



Specific Embodiments

[0047] b) Relation encoder: The phrases and words obtained from splitting the relation are treated as sequences, and a deep neural network, the bidirectional LSTM (Bi-LSTM), is used to convert them into distributed vectors; the relation-level representation itself directly uses the initialization vector from the vocabulary. The specific implementation is as follows: let the matrix V ∈ R^{v×d} be a word embedding matrix, initialized randomly, whose word representations are updated continuously during training, where v is the number of words in the vocabulary and d is the dimension of the word vectors. First, the representations of the relation at the three granularities are looked up in the word embedding matrix; then the Bi-LSTM models the phrase sequence and the word sequence respectively, learning the phrase representation h(p*) and the word representation h(ω*), where the last hidden state of the sequence model is selected as the feature vector of the relation at the "phrase level" ...
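As a minimal sketch of this step, the three-granularity split of a relation name and the random initialization of the embedding matrix V ∈ R^{v×d} can be illustrated as follows. The function names, the dot/underscore splitting convention, and the dimension d are assumptions for illustration; the Bi-LSTM itself is omitted, and only the embedding lookup that feeds it is shown.

```python
import numpy as np

def split_relation(relation):
    """Split a KB relation name (e.g. a Freebase-style predicate) into three
    granularities: relation level, phrase level, and word level."""
    rel_level = [relation]                              # whole relation as one token
    phrases = relation.split(".")                       # phrases separated by dots
    words = [w for p in phrases for w in p.split("_")]  # words inside each phrase
    return rel_level, phrases, words

def build_embeddings(tokens, d, seed=0):
    """Randomly initialize a word embedding matrix V in R^{v x d} and return it
    with a token -> row-index vocabulary (updated during training in the paper)."""
    rng = np.random.default_rng(seed)
    vocab = {tok: i for i, tok in enumerate(dict.fromkeys(tokens))}
    V = rng.standard_normal((len(vocab), d)) * 0.1
    return V, vocab

rel, phrases, words = split_relation("movie.movie.made")
V, vocab = build_embeddings(rel + phrases + words, d=8)
word_vecs = V[[vocab[w] for w in words]]  # word-level input sequence for the Bi-LSTM
print(word_vecs.shape)                    # prints (3, 8)
```

In the full model, `word_vecs` (and the analogous phrase-level sequence) would be fed through the Bi-LSTM, whose last hidden state serves as the phrase-level or word-level feature vector of the relation.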



Abstract

The invention discloses an end-to-end context-based knowledge base question answering method and device. The method and device consider the connections between entities and relations in a knowledge base, so that the two subtasks of knowledge base question answering, entity linking and relation prediction, promote and jointly improve each other. The method mainly comprises the steps of: preprocessing the question and removing special symbols; constructing a candidate entity set related to the question based on the knowledge base, and constructing a candidate relation set from the relations of the candidate entities in the knowledge base; for each entity in the candidate set, extracting the context of the entity in the question; splitting the candidate relations into different granularities; predicting the subject entity and the predicate relation with the CERM model; and using the predicted subject entity and relation to find the object entity in the knowledge base and return it as the answer. Entity linking and relation prediction in knowledge base question answering are integrated into a unified prediction model, joint prediction of subject entities and relations is achieved, and the accuracy of question answering is improved.
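The steps enumerated in the abstract can be sketched as a pipeline. All names here are assumptions for illustration: the toy triple store, the string-matching entity candidate generator, and the `score` placeholder standing in for the CERM joint scorer are not part of the original disclosure.

```python
import re

# Toy triple store: (subject, predicate) -> object. The real method queries a
# full knowledge base such as Freebase; this entry is purely illustrative.
KB = {("woodstock_villa", "film.film.directed_by"): "hansal_mehta"}

def preprocess(question):
    """Step 1: lowercase the question and remove special symbols."""
    return re.sub(r"[^\w\s]", "", question.lower())

def candidate_entities(question):
    """Step 2: candidate entities whose name appears in the question (toy matcher)."""
    return [s for (s, _p) in KB if s.replace("_", " ") in question]

def candidate_relations(entities):
    """Step 2b: relations that the candidate entities participate in."""
    return [p for (s, p) in KB if s in entities]

def score(entity, relation, question):
    """Placeholder for the CERM model, which jointly scores the entity's
    context in the question and the relation at multiple granularities."""
    return 1.0

def answer(question):
    """Steps 3-6: pick the best (entity, relation) pair, then look up the object."""
    q = preprocess(question)
    ents = candidate_entities(q)
    rels = candidate_relations(ents)
    best = max(((e, r) for e in ents for r in rels), key=lambda er: score(*er, q))
    return KB[best]

print(answer("Who directed the movie Woodstock Villa?"))  # prints hansal_mehta
```

The point of the unified model is that `score` evaluates the entity and the relation jointly, rather than running entity linking and relation prediction as separate stages whose errors compound.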

Description

Technical field [0001] The invention relates to a technology for automatically answering natural language questions using a knowledge base, and in particular to an end-to-end context-based knowledge base question answering method and device, belonging to the technical field of machine learning. Background technology [0002] The main task of knowledge base question answering is: given a natural language question, the computer automatically answers it based on the knowledge in the knowledge base. Common knowledge bases include Freebase, DBpedia, and WikiData. Knowledge in the knowledge base exists in the form of triples (S, P, O), where S denotes the subject entity, O denotes the object entity, and P denotes the relational predicate between the subject entity and the object entity. For example, for the question "Who made the movie Woodstock Villa?", the triple stored in the knowledge base Freebase is (m.03cz5_p, movie.movie.made, ...
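The (S, P, O) triple format described above can be illustrated with a minimal lookup. The triples below are hypothetical examples (the truncated Freebase triple in the text is left as in the source), and the `lookup` helper is an assumption for illustration.

```python
# A knowledge base stores facts as (S, P, O) triples:
# S = subject entity, P = relational predicate, O = object entity.
triples = [
    ("m.einstein", "people.person.place_of_birth", "m.ulm"),        # hypothetical ids
    ("m.ulm", "location.location.containedby", "m.germany"),
]

def lookup(subject, predicate):
    """Given S and P, return every matching object entity O as the answer."""
    return [o for (s, p, o) in triples if s == subject and p == predicate]

print(lookup("m.einstein", "people.person.place_of_birth"))  # prints ['m.ulm']
```

Answering a simple factoid question thus reduces to identifying the correct S and P from the question text, after which the answer O is a direct knowledge base lookup.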


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/33, G06F16/332, G06N3/04
CPC: G06F16/3329, G06F16/3344, G06N3/045
Inventor: 周德宇, 林超
Owner: SOUTHEAST UNIV