The invention discloses a multi-turn
dialog management method based on a hierarchical attention LSTM and a
knowledge graph, and belongs to the field of
natural language processing. The core idea of the method is as follows: the utterances exchanged between the user and the
system during the conversation are taken as the context, and the deep semantics of the context are extracted from the importance information and the temporal information at both the word
level and the sentence level. Specifically, this proceeds in two steps. First,
sentence semantics are extracted at the word level by a first attention-based LSTM, as sketched below.
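To make the first step concrete, the following is a minimal sketch of a word-level attention LSTM, assuming a PyTorch implementation; the class and parameter names (WordAttentionEncoder, emb_dim, hid_dim) and the bidirectional variant are illustrative assumptions, not taken from the patent, which only specifies an LSTM with attention. An LSTM reads the words of one utterance, and an additive attention layer pools the hidden states into a single sentence vector, so that important words receive higher weight.

    # Minimal sketch of the word-level step (illustrative, not the patented code).
    import torch
    import torch.nn as nn

    class WordAttentionEncoder(nn.Module):
        def __init__(self, vocab_size, emb_dim=128, hid_dim=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True,
                                bidirectional=True)
            # additive attention over word positions
            self.att_proj = nn.Linear(2 * hid_dim, hid_dim)
            self.att_score = nn.Linear(hid_dim, 1, bias=False)

        def forward(self, word_ids):
            # word_ids: (batch, seq_len) token indices of one utterance
            h, _ = self.lstm(self.embed(word_ids))        # (batch, seq, 2*hid)
            scores = self.att_score(torch.tanh(self.att_proj(h)))
            weights = torch.softmax(scores, dim=1)        # importance of each word
            return (weights * h).sum(dim=1)               # sentence vector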
Then, context semantics are extracted at the
sentence level by a second attention-based LSTM. The attention mechanism retains the important information and is realized with the
knowledge graph as external knowledge, while the LSTM retains the temporal information; together they identify the
user intent, and the recognition result is used to decide whether to open the next session.
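The second step can be sketched under the same assumptions. A second LSTM reads the sequence of sentence vectors in turn order, preserving the temporal information, and the attention query is a vector derived from knowledge-graph entity embeddings, so that turns related to known entities are weighted more heavily; the pooled context vector feeds an intent classifier. The patent does not specify how the knowledge graph enters the attention computation, so the KG-derived query vector (kg_query) is an assumed construction.

    # Minimal sketch of the sentence-level step (illustrative, not the patented code).
    import torch
    import torch.nn as nn

    class ContextAttentionEncoder(nn.Module):
        def __init__(self, sent_dim=256, hid_dim=128, n_intents=10):
            super().__init__()
            self.lstm = nn.LSTM(sent_dim, hid_dim, batch_first=True)
            self.att_score = nn.Linear(2 * hid_dim, 1)  # scores [turn ; kg query]
            self.intent_head = nn.Linear(hid_dim, n_intents)

        def forward(self, sent_vecs, kg_query):
            # sent_vecs: (batch, n_turns, sent_dim), outputs of the word-level step
            # kg_query:  (batch, hid_dim), e.g. a pooled embedding of the KG
            #            entities mentioned in the dialog (an assumption)
            h, _ = self.lstm(sent_vecs)                  # (batch, n_turns, hid)
            q = kg_query.unsqueeze(1).expand_as(h)       # broadcast over turns
            weights = torch.softmax(
                self.att_score(torch.cat([h, q], dim=-1)), dim=1)
            context = (weights * h).sum(dim=1)           # context vector
            return self.intent_head(context)             # intent logits

The intent predicted from these logits (e.g. by argmax) would then serve as the recognition result that decides whether the next session is opened.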
According to the multi-turn
dialog management method based on hierarchical attention LSTM and
knowledge graph, the knowledge graph and the LSTM are used to learn the deep
semantics of the context, and the attention mechanism is used to filter out useless information, thereby improving the efficiency and accuracy of user intent recognition.