
End-to-end printed Mongolian recognition translation method based on spatial transformation network

A spatial transformation and Mongolian language technology, applied in the fields of character recognition, natural language translation, and neural learning methods. It addresses the scarcity of research on low-resource languages, the poor results achieved so far in their recognition and translation, and the lack of databases, with the effect of improving recognition accuracy.

Active Publication Date: 2021-02-05
INNER MONGOLIA UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0005] At present, optical character recognition is mature for mainstream languages and achieves good recognition rates even for distorted characters. However, there is little research on low-resource languages; in particular, databases are severely lacking, and neither recognition nor translation has yet achieved good results for them.




Embodiment Construction

[0035] The implementation of the present invention will be described in detail below in conjunction with the drawings and examples.

[0036] The invention provides an end-to-end printed Mongolian recognition and translation method based on a spatial transformation network, which comprises two steps: character recognition and character translation. Before text recognition, the data may first be preprocessed so that the deep-learning neural network can extract features more effectively; the preprocessing mainly analyzes and segments the printed Mongolian text, as sketched below.
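The patent does not spell out the preprocessing algorithm. As one plausible illustration only, the following minimal NumPy sketch segments a binarized page image into text columns using a projection-profile gap heuristic; the function name, the min_gap threshold, and the column-wise heuristic are assumptions for illustration, not the patented procedure.

import numpy as np

def segment_columns(binary_img: np.ndarray, min_gap: int = 3) -> list[np.ndarray]:
    """Split a binarized page image (text = 1, background = 0) into vertical
    text columns using a column-wise projection profile.
    Illustrative heuristic only, not the patented preprocessing."""
    profile = binary_img.sum(axis=0)          # amount of ink in each image column
    is_text = profile > 0
    segments, start, gap = [], None, 0
    for x, has_ink in enumerate(is_text):
        if has_ink:
            if start is None:
                start = x                      # a new text column begins
            gap = 0
        elif start is not None:
            gap += 1
            if gap >= min_gap:                 # a wide enough blank run ends the column
                segments.append(binary_img[:, start:x - gap + 1])
                start, gap = None, 0
    if start is not None:                      # trailing column without a closing gap
        segments.append(binary_img[:, start:])
    return segments

Each returned segment could then be resized and fed to the recognition network described below.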

[0037] Text recognition is realized by an end-to-end printed Mongolian recognition network based on a spatial transformation network. Referring to figure 1, the present invention starts from the characteristics of Mongolian characters and realizes recognition through four stages: spatial transformation (Trans.), feature extraction (Feat.), sequence modeling (Seq.), and prediction (Pred.), wherein the spatial transformat...
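Although the stage-by-stage description is truncated above, the four-stage chain can be pictured as a single forward pass. The following PyTorch-style skeleton is a simplified sketch under that reading: the identity transform, plain CNN, bidirectional LSTM, and linear classifier are stand-ins for the spatial transformation network, CBAM-GRCNN, Mogrifier LSTM, and the GRU-based focusing-attention decoder, and the layer shapes are assumptions rather than the patented architecture.

import torch
import torch.nn as nn

class MongolianRecognizer(nn.Module):
    """Four-stage recognizer: Trans. -> Feat. -> Seq. -> Pred.
    Every stage here is a deliberately simplified stand-in."""
    def __init__(self, num_classes: int, hidden: int = 256):
        super().__init__()
        # Trans.: spatial transformation (a real STN would predict and apply
        # a sampling grid; identity placeholder here)
        self.trans = nn.Identity()
        # Feat.: visual feature extractor (stand-in for CBAM-GRCNN)
        self.feat = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d((2, 1)),
            nn.Conv2d(64, hidden, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, None)),   # collapse height into a feature sequence
        )
        # Seq.: contextual sequence modeling (stand-in for the Mogrifier LSTM)
        self.seq = nn.LSTM(hidden, hidden, bidirectional=True, batch_first=True)
        # Pred.: per-step classification (stand-in for the attention/GRU decoder)
        self.pred = nn.Linear(2 * hidden, num_classes)

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        x = self.trans(img)                    # (B, 1, H, W) normalized image
        f = self.feat(x).squeeze(2)            # (B, C, W') feature sequence
        f = f.permute(0, 2, 1)                 # (B, W', C) time-major for the LSTM
        h, _ = self.seq(f)                     # (B, W', 2*hidden)
        return self.pred(h)                    # (B, W', num_classes) glyph logits

# Example: a batch of 2 grayscale text images, 32x128 pixels
logits = MongolianRecognizer(num_classes=80)(torch.randn(2, 1, 32, 128))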



Abstract

The invention discloses an end-to-end printed Mongolian recognition and translation method based on a spatial transformation network. Starting from the characteristics of Mongolian characters, a four-stage character recognition network is used to recognize the Mongolian characters: in the spatial transformation stage, a spatial transformation network normalizes the Mongolian character images; text features are extracted with a CBAM-GRCNN combined with context information, and sequence modeling is carried out with a Mogrifier LSTM; in the prediction stage, a focusing attention mechanism is used to counter attention drift, and prediction is carried out in combination with a GRU network. In the translation process, a Reformer model is adopted: the traditional multi-head attention mechanism is replaced by an attention mechanism based on locality-sensitive hashing, the traditional residual network is replaced by a reversible residual network, and the feed-forward network is chunked, which reduces the time and space complexity of the model and alleviates the problems of insufficient memory and low speed when training on long sequences.
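The memory saving attributed to the reversible residual network comes from being able to recompute layer inputs from layer outputs instead of storing every activation. The sketch below illustrates that coupling in isolation; the generic LayerNorm/Linear sub-functions F and G stand in for the LSH-attention and chunked feed-forward sub-layers and are assumptions for illustration, not the patented model.

import torch
import torch.nn as nn

class ReversibleBlock(nn.Module):
    """Reversible residual coupling: inputs can be reconstructed exactly from
    outputs, so activations need not be stored for backpropagation.
    F and G stand in for the attention and feed-forward sub-layers."""
    def __init__(self, dim: int):
        super().__init__()
        self.F = nn.Sequential(nn.LayerNorm(dim), nn.Linear(dim, dim), nn.GELU())
        self.G = nn.Sequential(nn.LayerNorm(dim), nn.Linear(dim, dim), nn.GELU())

    def forward(self, x1, x2):
        y1 = x1 + self.F(x2)
        y2 = x2 + self.G(y1)
        return y1, y2

    def inverse(self, y1, y2):
        # Recover the inputs exactly from the outputs.
        x2 = y2 - self.G(y1)
        x1 = y1 - self.F(x2)
        return x1, x2

block = ReversibleBlock(dim=64)
a, b = torch.randn(2, 10, 64), torch.randn(2, 10, 64)
y1, y2 = block(a, b)
r1, r2 = block.inverse(y1, y2)
print(torch.allclose(a, r1, atol=1e-5), torch.allclose(b, r2, atol=1e-5))  # True True

In a full Reformer-style stack, inverse() would be used during backpropagation to reconstruct activations on the fly, trading extra computation for memory.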

Description

Technical field

[0001] The invention belongs to the technical field of optical character recognition (OCR) and machine translation, and in particular relates to a method for end-to-end printed Mongolian recognition and translation based on a spatial transformation network.

Background technique

[0002] Optical character recognition is a technology that uses computers to extract the text contained in images; it is one of the most effective means of converting images to text. Machine translation converts one language into a target language and is an effective way to overcome language barriers. With the development of deep learning, using deep learning for optical character recognition and machine translation tasks has become mainstream; Google, Baidu, Youdao and others have conducted extensive research on optical character recognition and machine translation and have developed practical applications.

[0003] Before the emergence of end-to-end character re...


Application Information

IPC(8): G06K9/20; G06K9/34; G06F40/58; G06N3/04; G06N3/08
CPC: G06F40/58; G06N3/08; G06V10/22; G06V30/153; G06V10/267; G06V30/10; G06N3/045
Inventor: 苏依拉, 崔少东, 程永坤, 仁庆道尔吉, 李雷孝, 石宝
Owner INNER MONGOLIA UNIV OF TECH