
A generative dialogue summarization method incorporating common sense knowledge

A generative, knowledge-based technology, applied in biological neural network models, instruments, computing, etc., to solve problems such as inaccurate and insufficiently abstractive dialogue summaries

Active Publication Date: 2022-07-01
HARBIN INST OF TECH
Cites: 15 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0005] The present invention aims to solve the problem that existing generative dialogue summarization methods do not use common sense knowledge, resulting in generated dialogue summaries that are inaccurate and insufficiently abstractive.

Method used


Image

  • A generative dialogue summarization method incorporating common sense knowledge

Examples

Experimental program
Comparison scheme
Effect test

Specific Embodiment 1

[0032] Embodiment 1: This embodiment is a generative dialogue summarization method incorporating common sense knowledge, comprising:

[0033] Step 1: Obtain the large-scale commonsense knowledge base ConceptNet and the dialogue summary dataset SAMSum.

[0034] Step 11. Obtain ConceptNet, a large-scale common sense knowledge base:

[0035] Obtain the large-scale common sense knowledge base ConceptNet from http://conceptnet.io/. The common sense knowledge it contains exists in the form of tuples, that is, tuple knowledge, which can be expressed as:

[0036] R=(h,r,t,w),

[0037] Here, R represents a knowledge tuple; h represents the head entity; r represents the relation; t represents the tail entity; and w represents the weight, i.e., the confidence of the relation. The knowledge R states that head entity h and tail entity t stand in relation r with weight w; for example, R = (call, related, contact, 10), indicating that the r...
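The tuple form R = (h, r, t, w) described above can be sketched as follows. This is a minimal illustration, not actual ConceptNet data: only the (call, related, contact, 10) tuple comes from the text, and the second tuple, the `min_weight` threshold, and the `related_knowledge` helper are illustrative assumptions.

```python
# Minimal sketch of tuple knowledge R = (h, r, t, w):
# head entity, relation, tail entity, confidence weight.
from collections import namedtuple

Knowledge = namedtuple("Knowledge", ["head", "relation", "tail", "weight"])

# Toy knowledge base; the second tuple is an invented low-confidence example.
knowledge_base = [
    Knowledge("call", "related", "contact", 10.0),
    Knowledge("call", "related", "shout", 1.0),
]

def related_knowledge(word, kb, min_weight=2.0):
    """Return tuples whose head entity matches `word`, dropping
    low-confidence (noisy) knowledge below `min_weight`."""
    return [k for k in kb if k.head == word and k.weight >= min_weight]

print(related_knowledge("call", knowledge_base))
```

Filtering on the weight w is one natural way to exclude the "noise knowledge" mentioned in Step 21 below; the patent itself does not fix a specific threshold here.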

Specific Embodiment 2

[0046] Embodiment 2: This embodiment differs from Embodiment 1 in that step 2 uses the acquired large-scale common sense knowledge base ConceptNet to introduce tuple knowledge into the dialogue summary dataset SAMSum and construct a heterogeneous dialogue graph. The process is:

[0047] Step 21. Obtain dialogue-related knowledge: for a given dialogue, the present invention first retrieves a series of related tuple knowledge from ConceptNet according to the words in the dialogue, excludes noisy knowledge, and finally obtains the set of tuple knowledge related to the given dialogue, as shown in Figure 4;

[0048] Step 22. Build a sentence-knowledge graph:

[0049] For the related tuple knowledge obtained in step 21, suppose there are sentence A and sentence B, with word a belonging to sentence A and word b belonging to sentence B. If the related knowledge of a and the related knowledge of b share the same tail entity t, then sentence A and sentence B are both connected to tail entity t; get sentence...
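The sentence-knowledge connection rule of Step 22 can be sketched as building a bipartite graph between sentence nodes and tail-entity knowledge nodes. The toy dialogue, the word-to-tail-entity lookup table, and the `build_graph` helper below are all illustrative assumptions, not the patent's actual data structures.

```python
# Sketch of Step 22: connect two sentences to a shared knowledge node
# whenever words from each sentence retrieve tuples with the same tail
# entity. The lookup table stands in for actual ConceptNet retrieval.

# word -> set of tail entities retrieved from ConceptNet (assumed)
tails = {
    "call": {"contact"},
    "phone": {"contact"},
    "lunch": {"food"},
}

sentences = {
    "A": ["please", "call", "me"],
    "B": ["my", "phone", "died"],
    "C": ["lunch", "tomorrow"],
}

def build_graph(sentences, tails):
    """Return the edge set (sentence_id, knowledge_node) of the
    bipartite sentence-knowledge graph."""
    edges = set()
    for sid, words in sentences.items():
        for w in words:
            for t in tails.get(w, ()):
                edges.add((sid, t))
    return edges

g = build_graph(sentences, tails)
# Sentences A and B both link to the knowledge node "contact", so they
# become connected through it in the heterogeneous dialogue graph.
print(sorted(g))
```

Routing the sentence-sentence connection through a shared tail-entity node (rather than a direct edge) is what makes the graph heterogeneous: sentence nodes and knowledge nodes coexist and carry different content.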

Specific Embodiment 3

[0058] Embodiment 3: This embodiment differs from Embodiment 1 or 2 in that in step 31 a node encoder is constructed, and a bidirectional long short-term memory network (Bi-LSTM) is used to obtain the node initialization representation and the word initialization representation. The specific process is:

[0059] For the heterogeneous dialogue graph proposed by the present invention in step 2, each node v_i contains |v_i| words, with word sequence w_{i,1}, ..., w_{i,|v_i|}, where w_{i,n} denotes the n-th word of node v_i and n ∈ [1, |v_i|]. A bidirectional long short-term memory network (Bi-LSTM) is used to generate the forward hidden state sequence and the backward hidden state sequence, where x_n denotes the word vector representation of w_{i,n}. The initial representation of the node is obtained by splicing the last hidden state of the forward sequence with the first hidden state of the backward sequence ...



Abstract

A generative dialogue summarization method incorporating common sense knowledge belongs to the field of natural language processing. The invention solves the problem that existing generative dialogue summarization methods do not utilize common sense knowledge, resulting in inaccurate and insufficiently abstractive summaries. The method of the invention comprises: acquiring the common sense knowledge base ConceptNet and the dialogue summary dataset SAMSum; using the acquired ConceptNet to introduce tuple knowledge into SAMSum and construct a heterogeneous dialogue graph; training the heterogeneous dialogue graph neural network model constructed in step 3; and generating the final dialogue summary from a dialogue through the trained model. The present invention is applied to the generation of dialogue summaries.

Description

Technical field

[0001] The invention relates to the field of natural language processing, in particular to a generative dialogue summarization method incorporating common sense knowledge.

Background technique

[0002] Within automatic text summarization (Automatic Summarization) [1] (Title: Constructing literature abstracts by computer: techniques and prospects, Author: Chris D Paice, Year: 1990, cited from Information Processing & Management), the field of abstractive dialogue summarization (Abstractive Dialogue Summarization) takes a given multi-party dialogue transcript and generates a short text description containing the key information of the conversation; Figure 1 shows a multi-party conversation and its corresponding reference summary.

[0003] For dialogue summarization, most existing work focuses on abstractive methods, that is, allowing the final summary to contain novel words and phrase...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F16/332, G06F40/295, G06N3/04
CPC: G06F16/3329, G06F40/295, G06N3/045, G06N3/044
Inventors: 冯骁骋, 冯夏冲, 秦兵, 刘挺
Owner HARBIN INST OF TECH