
Question answering method and system based on brain-inspired semantic hierarchical temporal memory reasoning model

A Semantic, Temporal Technique Applied to the Field of Cognitive Neuroscience

Active Publication Date: 2021-02-02
INST OF AUTOMATION CHINESE ACAD OF SCI
3 Cites · 0 Cited by
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] To solve the above problems in the prior art, namely the small-sample learning problem of natural language understanding tasks such as text generation and automatic question answering under human-like pattern recognition algorithms, the present invention provides a question answering method based on a brain-inspired semantic hierarchical temporal memory reasoning model, comprising:


Image

  • Question answering method and system based on brain-inspired semantic hierarchical temporal memory reasoning model

Examples

Experimental program
Comparison scheme
Effect test

Embodiment Construction

[0072] The present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the related invention, not to limit it. Note also that, for convenience of description, the drawings show only the parts relevant to the related invention.

[0073] It should be noted that, where no conflict arises, the embodiments of the present application and the features within them may be combined with each other. The present application is described in detail below with reference to the accompanying drawings and embodiments.

[0074] Traditional neural networks, unlike the way humans learn knowledge, require large amounts of data and continuous model optimization, whereas the HTM model (Hierarchical Temporal Memory, here the semantic hierarchical temporal memory model) is rarely used in natural langu...
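As a loose illustration of the temporal-memory idea the embodiment builds on, the sketch below learns word-to-word transitions from a single pass over a sequence and predicts the most likely successor. All names are hypothetical; this is a simple transition-counting analogue of HTM's cell prediction states, not the patent's actual mechanism.

```python
from collections import defaultdict

class ToyTemporalMemory:
    """Toy sequence memory: learns which word tends to follow which.

    A simplified stand-in for HTM's temporal memory, where cells in a
    predictive state anticipate the next input. Here we just count
    observed transitions and predict the most frequent successor.
    """

    def __init__(self):
        # transitions[prev][next] = number of times `next` followed `prev`
        self.transitions = defaultdict(lambda: defaultdict(int))

    def learn(self, sequence):
        """One online pass over a word sequence (no iterative training)."""
        for prev, nxt in zip(sequence, sequence[1:]):
            self.transitions[prev][nxt] += 1

    def predict(self, word):
        """Return the most frequent observed successor, or None if unseen."""
        followers = self.transitions.get(word)
        if not followers:
            return None
        return max(followers, key=followers.get)

tm = ToyTemporalMemory()
tm.learn(["what", "is", "HTM", "?", "a", "memory", "model"])
print(tm.predict("is"))  # → HTM
```

Like the HTM approach described above, the sketch learns from a single presentation of the data rather than many optimization rounds, which is the small-sample property the patent emphasizes.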


Abstract

The invention belongs to the field of cognitive neuroscience and relates to a question answering method and system based on a brain-inspired semantic hierarchical temporal memory reasoning model, aiming to solve the small-sample learning problem of natural language understanding tasks such as text generation and automatic question answering. The method comprises: acquiring and inputting a question text and an answer text; performing temporal pooling on the text to obtain a word vector matrix; performing spatial and temporal pooling on each word vector in the matrix to obtain, for each vector, a binary word representation in which every bit is 0 or 1; performing brain-inspired learning on the text and the word set to obtain an optimized model; and, given the question text alone, performing word reduction based on the cell prediction states in the model to obtain and output an answer text. By combining a semantic hierarchical temporal memory model with a learning mode built on small-sample data and knowledge reasoning, the method requires few samples, needs no large-scale parameter tuning, and improves the scalability of the model.
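The "binary word representation in which every bit is 0 or 1" in the abstract corresponds to the sparse distributed representations (SDRs) that HTM-style spatial pooling produces. The sketch below shows one common way such a binarization can work, by activating the top-k columns of a random projection; the dimensions, sparsity, and projection are illustrative assumptions, not values from the patent.

```python
import numpy as np

def spatial_pool(word_vec, proj, sparsity=0.02):
    """Map a dense word vector to a sparse binary representation (SDR).

    Each pooler column computes an overlap score with the input; the
    top-k scoring columns become active (bit = 1) and the rest stay 0,
    so every bit of the output is 0 or 1 and only ~sparsity are set.
    """
    overlaps = proj @ word_vec                   # one score per column
    k = max(1, int(sparsity * proj.shape[0]))    # number of active bits
    active = np.argsort(overlaps)[-k:]           # indices of top-k columns
    sdr = np.zeros(proj.shape[0], dtype=np.uint8)
    sdr[active] = 1
    return sdr

rng = np.random.default_rng(0)
proj = rng.standard_normal((2048, 300))          # 2048 columns, 300-dim input
vec = rng.standard_normal(300)                   # stand-in for a word vector
sdr = spatial_pool(vec, proj)
print(sdr.sum())  # → 40, i.e. about 2% of 2048 bits active
```

A fixed, low number of active bits is what makes such representations cheap to store and compare, which fits the abstract's claim of low sample and tuning requirements.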

Description

Technical Field

[0001] The invention belongs to the field of cognitive neuroscience and in particular relates to a question answering method and system based on a brain-inspired semantic hierarchical temporal memory reasoning model.

Background

[0002] Although traditional neural networks can solve pattern recognition problems involving images, speech, or text, they typically require many rounds of iterative training on large amounts of data, which does not match how humans learn. When learning to recognize images or memorize text, humans rarely need extensive repetition; human learning is an online process. Faced with new knowledge, humans reason and draw analogies from previously acquired knowledge, and can therefore learn new knowledge faster. In contrast, traditional neural network algorithms have achieved good results on some pattern recognition tasks, but when faced with ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F16/33; G06F16/332; G06F16/36; G06N5/04
CPC: G06N5/04
Inventors: 王寓巍, 张铁林, 曾毅
Owner INST OF AUTOMATION CHINESE ACAD OF SCI