
Character translation and display method and device based on augmented reality and electronic equipment

A technology combining augmented reality with a display method, applied in the field of computer vision. It addresses problems such as the low convenience and complex operation of the photograph-and-translate process, and the inability to replace original text with translated text in an accurate, intuitive one-to-one correspondence. The effects include improved user-friendliness, a more realistic display effect, and enhanced convenience.

Inactive Publication Date: 2020-05-26
西安欧思奇软件有限公司
Cites: 0 · Cited by: 8

AI Technical Summary

Problems solved by technology

[0003] However, when translating text in an image by the above method, the entire translation process must be carried out step by step: the user first performs a photographing operation to obtain an image containing text, and then the translation engine recognizes and translates the text in that image. The user thus needs two steps to obtain the translated text. Moreover, because the translated text is shown in a separate translation-result display area, apart from the image itself, the translated text cannot directly replace the original text in an accurate, intuitive one-to-one correspondence while the user views it.
[0004] Therefore, the prior art has the following defects: the photograph-and-translate process is relatively complicated to operate and not very convenient; and because the translated text is displayed separately from the image, it cannot directly replace the original text in an accurate, intuitive one-to-one manner when the user views it, so the display method is not very user-friendly.



Examples


Embodiment 1

[0058] The embodiment of the present application provides a text translation and display method based on augmented reality. As shown in Figure 1, the method includes:

[0059] Step S101, identifying characters in the image captured by the image acquisition device of the terminal device, and determining position information of the recognized characters in the image plane coordinate system.

[0060] Here, the image acquisition device can capture images in real time and recognize the text in them. In a practical application scenario, the user starts the camera of the mobile device and simply aligns the viewfinder with the text; no separate shooting operation is required. The camera automatically recognizes the text in the image and simultaneously determines the text's position information in the image plane coordinate system, which better meets users' needs for instant capture and translation in different scenarios.

[0061] Step ...
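The data flow of Step S101 can be sketched as follows. This is a minimal illustration, not the patent's implementation: `recognize_characters` is a hypothetical stand-in for the on-device recognizer, and the bounding boxes are assumed to be in image-plane pixel coordinates.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class RecognizedText:
    text: str
    # Position in the image plane coordinate system:
    # (x, y) of the top-left corner plus width and height, in pixels.
    bbox: Tuple[int, int, int, int]

def recognize_characters(frame) -> List[RecognizedText]:
    """Hypothetical stand-in for the terminal device's recognizer.

    A real implementation would run OCR on the camera frame; here we
    return a fixed result so the data flow of Step S101 is visible.
    """
    return [RecognizedText(text="hello", bbox=(40, 60, 120, 30))]

def step_s101(frame) -> List[RecognizedText]:
    # Identify characters in the captured frame and record where each
    # one sits in the image plane coordinate system.
    results = recognize_characters(frame)
    for r in results:
        x, y, w, h = r.bbox
        print(f"'{r.text}' at image-plane position ({x}, {y}), size {w}x{h}")
    return results

boxes = step_s101(frame=None)
```

In the patent's method these positions are what module 403 later maps into the physical three-dimensional coordinate system.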

Embodiment 2

[0069] The embodiment of the present application provides another possible implementation. On the basis of the first embodiment, it further includes the method shown in the second embodiment, wherein in step S101 the recognition of text in the image collected by the image acquisition device of the terminal device includes:

[0070] The text in the image collected by the image acquisition device of the terminal device is recognized by an optical character recognition (OCR) algorithm.

[0071] Further, recognizing the text in the image collected by the image acquisition device of the terminal device through the OCR algorithm specifically includes:

[0072] binarizing the image;

[0073] dividing the binarized image into multiple word blocks;

[0074] extracting feature information from each word block, matching the extracted feature information against a feature database, and taking the matching result as the recognition result for each word block;

[0075] ...
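The OCR steps in paragraphs [0072]–[0074] can be illustrated with a toy pipeline. This is a deliberately simplified sketch under stated assumptions (a tiny 2D grayscale array, word blocks split on blank columns, and a crude pixel-count "feature" matched against a hand-made feature database); a production OCR engine would use far richer features and matching.

```python
from typing import Dict, List

def binarize(image: List[List[int]], threshold: int = 128) -> List[List[int]]:
    # Step [0072]: map each grayscale pixel to 1 (dark/foreground) or 0.
    return [[1 if px < threshold else 0 for px in row] for row in image]

def split_blocks(binary: List[List[int]]) -> List[List[List[int]]]:
    # Step [0073]: split the binarized image into word blocks,
    # using all-blank columns as separators.
    width = len(binary[0])
    blocks, start = [], None
    for col in range(width + 1):
        blank = col == width or all(row[col] == 0 for row in binary)
        if not blank and start is None:
            start = col
        elif blank and start is not None:
            blocks.append([row[start:col] for row in binary])
            start = None
    return blocks

def features(block: List[List[int]]) -> int:
    # Toy feature: number of foreground pixels in the block.
    return sum(sum(row) for row in block)

def recognize(image: List[List[int]], database: Dict[int, str]) -> List[str]:
    # Step [0074]: match each block's features against the feature
    # database; the nearest entry is the block's recognition result.
    binary = binarize(image)
    out = []
    for block in split_blocks(binary):
        f = features(block)
        best = min(database, key=lambda k: abs(k - f))
        out.append(database[best])
    return out

# A 3x7 grayscale "image" with two dark strokes separated by blank columns.
img = [
    [255,   0, 255, 0, 255, 255, 255],
    [255, 255, 255, 0, 255, 255, 255],
    [255,   0, 255, 0, 255, 255, 255],
]
db = {2: "i", 3: "l"}  # hand-made feature database for this toy example
result = recognize(img, db)
print(result)
```

The two strokes contain 2 and 3 foreground pixels, so the nearest database entries yield `["i", "l"]`.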

Embodiment 3

[0120] The present application provides a text translation and display device 40 based on augmented reality. As shown in Figure 4, the device may include: an image recognition module 401, a text translation module 402, an overlay position determination module 403, and a display module 404, wherein:

[0121] The image recognition module 401 is configured to recognize the text in the image collected by the image collection device of the terminal device, and determine the position information of the recognized text in the image plane coordinate system.

[0122] The text translation module 402 is configured to translate the recognized text into a target translated text in the target language.

[0123] The superimposition position determination module 403 is configured to determine the superimposition position in the physical three-dimensional space coordinate system corresponding to the augmented reality display information of the target translated text according to the posi...
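The four modules of device 40 can be wired together as in the following skeleton. This is an illustrative sketch, not the patented implementation: the recognizer and translator are stubbed with lambdas, and the image-plane-to-3D mapping is reduced to a placeholder with a fixed assumed depth, since this excerpt does not give the actual projection.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Recognized:
    text: str
    position_2d: Tuple[int, int]  # image plane coordinate system

@dataclass
class Overlay:
    text: str
    position_3d: Tuple[float, float, float]  # physical 3D coordinates

class ARTranslationDevice:
    """Sketch of device 40: modules 401-404 chained in order."""

    def __init__(self, recognizer: Callable, translator: Callable):
        self.recognizer = recognizer  # image recognition module 401
        self.translator = translator  # text translation module 402

    def locate_overlay(self, pos_2d, depth: float = 1.0):
        # Overlay position determination module 403: map the text's
        # image-plane position into the physical 3D space coordinate
        # system. The fixed depth is an assumption for illustration.
        x, y = pos_2d
        return (float(x), float(y), depth)

    def display(self, frame) -> List[Overlay]:
        # Display module 404: superimpose the translated text at the
        # 3D overlay position (here we just build the overlay records).
        overlays = []
        for item in self.recognizer(frame):
            translated = self.translator(item.text)
            overlays.append(Overlay(translated, self.locate_overlay(item.position_2d)))
        return overlays

# Stub recognizer and translator standing in for real components.
device = ARTranslationDevice(
    recognizer=lambda frame: [Recognized("你好", (50, 80))],
    translator=lambda text: {"你好": "hello"}.get(text, text),
)
result = device.display(frame=None)
print(result)
```

The point of the sketch is the module chaining: 401 produces text plus image-plane positions, 402 translates, 403 lifts positions into 3D, and 404 consumes the overlays for rendering.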



Abstract

The invention provides a character translation and display method based on augmented reality. The method comprises the steps of: recognizing a character in an image collected by an image collection device of terminal equipment, and determining the position information of the recognized character in an image plane coordinate system; translating the recognized characters into target translated characters of a target language; determining, according to the position information of the character in the image plane coordinate system, a superposition position of the target translated character in a physical three-dimensional space coordinate system corresponding to the augmented reality display information; and displaying the augmented reality display information of the target translated character at the superposition position in a superimposed manner through augmented reality. Because the augmented reality display information of the target translated character is displayed at the superposition position through augmented reality technology, it fits seamlessly over the character in the image, and the display effect is more vivid. The invention further provides a character translation and display device based on augmented reality, and electronic equipment.

Description

Technical field

[0001] The present application relates to the technical field of computer vision, and in particular to a text translation and display method, device, and electronic equipment based on augmented reality.

Background technique

[0002] Text recognition and translation technology in the prior art usually realizes translation of text through a translation engine. The specific process is: the user performs a photographing operation to obtain an image containing text and inputs the image to the translation engine; the translation engine translates the text in the image to obtain the translated text; finally, the translated text is displayed in the translation result display area.

[0003] However, when translating the text in the image through the above method, since the entire translation process needs to be carried out step by step, that is, the user needs to perform a photo operation to obtain an image wit...

Claims


Application Information

Patent Timeline: not available
Patent Type & Authority: Applications (China)
IPC (8): G06F40/58; G06K9/00; G06K9/20; G06K9/34; G06K9/46; G06F3/01
CPC: G06F3/011; G06V20/64; G06V30/40; G06V10/22; G06V10/267; G06V30/153; G06V10/56; G06V30/10
Inventors: 张乐杰, 李玉峰
Owner: 西安欧思奇软件有限公司