
Graph-Based Bilingual Recurrent Autoencoder

An autoencoder-based bilingual technology, applied in semantic analysis, natural language translation, and related instrumentation, that addresses problems such as insufficient consideration of semantic constraints.

Inactive Publication Date: 2019-07-09
XIAMEN UNIV

AI Technical Summary

Problems solved by technology

However, the traditional method only considers the reconstruction error and the semantic correspondence of bilingual phrases during modeling, and does not sufficiently account for the semantic constraint relationships between phrases.
Existing methods therefore remain deficient, and how to learn better bilingual phrase embedding representations is still a problem worth studying.


Examples


Specific embodiments

[0035] The first step is to extract bilingual phrases from the parallel corpus as training data, and calculate the translation probability between bilingual phrases.
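The sketch below is one minimal way to realize this step, assuming the training data is simply a list of extracted (source phrase, target phrase) pairs; the relative-frequency estimate of p(target | source) shown here is a standard choice and not necessarily the exact estimator used in the invention.

```python
from collections import Counter, defaultdict

def translation_probabilities(phrase_pairs):
    """Estimate p(target | source) by relative frequency over extracted phrase pairs.

    phrase_pairs: iterable of (source_phrase, target_phrase) tuples
    returns: dict mapping (source_phrase, target_phrase) -> probability
    """
    pair_counts = Counter(phrase_pairs)
    source_counts = defaultdict(int)
    for (src, _tgt), count in pair_counts.items():
        source_counts[src] += count

    return {
        (src, tgt): count / source_counts[src]
        for (src, tgt), count in pair_counts.items()
    }

# Toy phrase pairs extracted from a parallel corpus
pairs = [("big house", "grande maison"), ("big house", "grande maison"),
         ("big house", "maison"), ("red car", "voiture rouge")]
probs = translation_probabilities(pairs)
# probs[("big house", "grande maison")] == 2/3
```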

[0036] The second step is to calculate the paraphrase probability based on the pivot language method.
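A sketch of the usual pivot-based paraphrase estimate, assuming the translation probabilities from the first step are available in both directions (source-to-target and target-to-source); the exact formulation used in the invention may differ. The reverse direction can be obtained by running the same relative-frequency estimate on the reversed phrase pairs.

```python
from collections import defaultdict

def paraphrase_probabilities(p_tgt_given_src, p_src_given_tgt):
    """Pivot-based paraphrase probability for source-language phrases:
        p(e2 | e1) = sum_f p(e2 | f) * p(f | e1),
    where f ranges over target-language (pivot) phrases.

    p_tgt_given_src: dict (src, tgt) -> p(tgt | src)
    p_src_given_tgt: dict (tgt, src) -> p(src | tgt)
    """
    # Index translation distributions by the conditioning phrase.
    tgt_by_src = defaultdict(list)
    for (src, tgt), p in p_tgt_given_src.items():
        tgt_by_src[src].append((tgt, p))

    src_by_tgt = defaultdict(list)
    for (tgt, src), p in p_src_given_tgt.items():
        src_by_tgt[tgt].append((src, p))

    # Marginalize over all pivot phrases shared by e1 and e2.
    paraphrase = defaultdict(float)
    for e1, translations in tgt_by_src.items():
        for pivot, p_pivot_given_e1 in translations:
            for e2, p_e2_given_pivot in src_by_tgt[pivot]:
                paraphrase[(e1, e2)] += p_e2_given_pivot * p_pivot_given_e1
    return dict(paraphrase)
```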

[0037] The third step is to construct the semantic relationship graph of the bilingual phrases. Taking source phrases and target phrases as nodes, an edge is constructed between any source phrase and target phrase that form a phrase pair in the bilingual phrase corpus. The set of all nodes and the set of all edges together constitute the semantic relationship graph of the bilingual phrases.
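A minimal sketch of this construction, assuming the bilingual phrase corpus is given as a collection of (source, target) pairs; the bipartite graph is stored as two adjacency maps, one per language side.

```python
def build_semantic_graph(phrase_pairs):
    """Build the bilingual semantic relationship graph.

    Nodes are source phrases and target phrases; an edge links a source
    phrase to a target phrase whenever they occur together as a pair in
    the bilingual phrase corpus.

    Returns (src_adj, tgt_adj): adjacency sets keyed by source-side and
    target-side nodes respectively.
    """
    src_adj, tgt_adj = {}, {}
    for src, tgt in phrase_pairs:
        src_adj.setdefault(src, set()).add(tgt)
        tgt_adj.setdefault(tgt, set()).add(src)
    return src_adj, tgt_adj
```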

[0038] The fourth step is to define two implicit semantic constraints based on the semantic relationship graph of the bilingual phrases. For two different nodes in the same language, if they are connected to the same set of nodes in the other language, they are considered close to each other in the semantic space; this is constraint one. ...
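An illustrative check for constraint one on that graph: two same-language nodes whose neighbor sets in the other language coincide are treated as semantically close. The Jaccard threshold parameter below is a hypothetical knob for relaxing the "same set" condition and is not specified in the patent text.

```python
def constraint_one_pairs(adj, min_overlap=1.0):
    """Find pairs of same-language nodes linked to the same set of nodes
    in the other language (Jaccard overlap >= min_overlap).

    adj: adjacency map for one language side, e.g. src_adj from
         build_semantic_graph; such pairs are assumed close in the shared
         semantic space (constraint one).
    """
    nodes = list(adj)
    close_pairs = []
    for i, a in enumerate(nodes):
        for b in nodes[i + 1:]:
            intersection = adj[a] & adj[b]
            union = adj[a] | adj[b]
            if union and len(intersection) / len(union) >= min_overlap:
                close_pairs.append((a, b))
    return close_pairs
```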



Abstract

The invention discloses a graph-based bilingual recursive autoencoder, and relates to natural language processing based on deep learning. Bilingual phrases are extracted from parallel corpora as training data, and translation probabilities between the bilingual phrases are calculated; paraphrase probabilities are calculated using the pivot language method; a semantic relationship graph of the bilingual phrases is constructed; and on the basis of this graph, the model objective function is formulated and the model parameters are trained. Aiming at learning better bilingual phrase embedding representations, the graph-based bilingual recursive autoencoder is proposed to address the fact that traditional methods do not sufficiently consider the semantic constraint relationships present in natural language. The algorithm is concrete and the approach is clear, and the learned bilingual phrase embedding representations are improved so that they better serve natural language processing tasks. First, the semantic relationship graph of the bilingual phrases is constructed; then two implicit semantic constraints are defined through the graph structure in order to learn more accurate bilingual phrase embedding representations, which can be better applied to natural language processing tasks such as machine translation.
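As a rough illustration of how such an objective might be assembled, the weighted sum below combines the recursive autoencoder's reconstruction error, the bilingual semantic correspondence error, and penalty terms for the two graph-based constraints; the specific terms, weights, and regularizer are assumptions made for exposition and are not taken from the patent text.

$$ J(\theta) = E_{\text{rec}}(\theta) + \alpha\, E_{\text{sem}}(\theta) + \beta\, E_{C_1}(\theta) + \gamma\, E_{C_2}(\theta) + \lambda \lVert \theta \rVert^2 $$

Here $E_{\text{rec}}$ is the phrase reconstruction error of the recursive autoencoder, $E_{\text{sem}}$ measures how well paired source and target phrase embeddings correspond, $E_{C_1}$ and $E_{C_2}$ penalize large embedding distances between phrase pairs selected by constraints one and two, and $\alpha, \beta, \gamma, \lambda$ are hyperparameters.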

Description

Technical field

[0001] The present invention relates to natural language processing based on deep learning, and in particular to a graph-based bilingual recursive autoencoder.

Background

[0002] Natural language processing is an important research direction of artificial intelligence within computer science. It studies how to enable effective communication between humans and computers using natural language, and it integrates linguistics, computer science, and mathematics.

[0003] The present invention mainly involves the construction of a graph-based bilingual recursive autoencoder and its use in modeling bilingual phrase embedding representations. A neural network is a mathematical model that processes information using a synaptic connection structure similar to that of the brain's neurons. In recent years, research on natural language processing based on neural networks has become the main trend in the development of this discipline, and various neural networks ha...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F17/27, G06F17/28
CPC: G06F40/30, G06F40/44
Inventor: 苏劲松, 殷建民, 宋珍巧, 阮志伟
Owner: XIAMEN UNIV