Augmented reality virtual keyboard input method and apparatus using same

A virtual keyboard input and augmented reality technology, applied to the input/output of user/computer interaction, computer components, graphics reading, and related fields. It addresses problems such as words that cannot be found in a thesaurus, slow response speed, and adaptability and robustness that need further improvement.

Active Publication Date: 2016-01-06

AI Technical Summary

Problems solved by technology

In addition, if the operator wants to output not a common word but a special abbreviation, that abbreviation may not be found in the thesaurus at all.
Even if the candidate options are comprehensive, input slows down because the operator must choose from a large number of results.
If the system instead judges the current result automatically based on the operator's previous output, not only is the response speed reduced, but the result may also be inaccurate. In addition, each operator has different keystroke habits, so different operators must adapt to the program's default ranges, for example, how far the right index finger must move from its starting position for the display to show the number 7 rather than the letter U.
[0018] There is also a patent for virtual keyboard technology that judges input from static gesture information and finger curvature, but it requires operator training, and its adaptability and robustness need further improvement.

Examples

Embodiment 1

[0077] Figure 1 shows the flowchart of Embodiment 1 of the present invention.

[0078] Specifically, whenever a simulated keystroke is detected, the distances between the keystroke position and the positions of those keys on the virtual keyboard that satisfy certain rules are calculated first, and these distance values are saved until the operator is identified as making the end-input action; the sorting result is then calculated from the distance values according to the sorting rules.
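As a rough illustration of this step, the sketch below accumulates per-keystroke distances and ranks candidate words by total distance once the end-input action is detected. The key layout, the distance cutoff standing in for the patent's "certain rules", and the summed-distance scoring standing in for its "sorting rules" are all assumptions made for illustration, not the patent's actual definitions.

```python
# Minimal sketch, not the patent's implementation: key positions, the
# distance cutoff ("certain rules") and the summed-distance scoring
# ("sorting rules") are illustrative assumptions.
import math

def distance(p, q):
    """Straight-line distance between two 2-D points (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

class KeystrokeAccumulator:
    def __init__(self, key_positions, max_distance=1.5):
        self.key_positions = key_positions  # key label -> (x, y) centre on the virtual keyboard
        self.max_distance = max_distance    # only keys within this radius are considered
        self.per_keystroke = []             # one {key: distance} dict per simulated keystroke

    def on_simulated_keystroke(self, hit_point):
        """Save the distances from this keystroke to every nearby key."""
        nearby = {}
        for key, pos in self.key_positions.items():
            d = distance(hit_point, pos)
            if d <= self.max_distance:
                nearby[key] = d
        self.per_keystroke.append(nearby)

    def on_end_input(self, candidate_words):
        """After the end-input action, rank candidates by summed distance."""
        def score(word):
            if len(word) != len(self.per_keystroke):
                return float("inf")         # wrong length: rank last
            total = 0.0
            for ch, dists in zip(word, self.per_keystroke):
                if ch not in dists:
                    return float("inf")     # this letter was never near a keystroke
                total += dists[ch]
            return total
        return sorted(candidate_words, key=score)
```

A smaller summed distance means the keystroke sequence fell closer to that word's keys, so that candidate is offered first.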

[0079] The flowchart includes:

[0080] Step 101 activates the input box or opens the program, and waits for input.

[0081] Specifically, activating the input box means reactivating the input box to enter the input state when the program is already open, or automatically activating the input box to enter the input state after the program is opened. Other ways of entering the input state can also be used.

[0082] Step 102: The analysis module determines the placement position ...

Embodiment 2

[0220] Figure 2 shows the flowchart of Embodiment 2 of the present invention.

[0221] Specifically, whenever a simulated keystroke is detected, the distances between the keystroke position and the positions of those keys on the virtual keyboard that satisfy certain rules are calculated first, and these distance values are saved until the operator is identified as making the end-input action; the sorting result is then calculated from the distance values according to the sorting rules.

[0222] The flowchart includes:

[0223] Step 201 activates the input box or opens the program, and waits for input.

[0224] Specifically, the same as step 101.

[0225] Step 202: The analysis module determines the placement position and placement angle of the virtual keyboard in the real scene according to the keyboard movement data captured by the detection module, or determines the placement position and placement angle of the virtual keyboard according to preset settings...
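A minimal sketch of what step 202 might look like, assuming the detection module reports a list of 2-D points from the positioning gesture. The two-anchor-point convention and the preset fallback values below are illustrative assumptions, not the patent's actual method.

```python
# Illustrative sketch of step 202, not the patent's implementation:
# the two-anchor-point convention and the preset values are assumptions.
import math

PRESET_PLACEMENT = {"position": (0.0, 0.0), "angle_deg": 0.0}

def keyboard_placement(movement_points=None):
    """Derive the virtual keyboard's position and angle from captured movement
    data, falling back to the preset placement when no usable data exists."""
    if not movement_points or len(movement_points) < 2:
        return dict(PRESET_PLACEMENT)
    left, right = movement_points[0], movement_points[-1]
    # Midpoint of the first and last captured points anchors the keyboard centre.
    position = ((left[0] + right[0]) / 2, (left[1] + right[1]) / 2)
    # The angle of the left-to-right axis gives the placement angle.
    angle = math.degrees(math.atan2(right[1] - left[1], right[0] - left[0]))
    return {"position": position, "angle_deg": angle}
```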

Embodiment 3

[0257] Figure 3 shows the flowchart of Embodiment 3 of the present invention.

[0258] Specifically, after each simulated keystroke action is completed, the distance values obtained from this simulated keystroke action, together with the distance values obtained from previous simulated keystroke actions, are analyzed according to the sorting rules and the candidates are updated so that the operator can choose one to confirm the input. If the operator does not select any of these candidates to confirm the input but instead continues to make simulated keystrokes, this process is repeated from step 305 until the operator confirms a candidate, at which point the input is completed. In the process of this embodiment, there is no step of detecting, analyzing, and judging whether the operator has made an end-input action.
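The loop below is a minimal sketch of this incremental variant: candidates are re-ranked after every simulated keystroke and offered for confirmation, and input ends only when the operator confirms one. The callback names are assumptions made for illustration.

```python
# Sketch of Embodiment 3's loop (illustrative, not the patent's code):
# re-rank candidates after every keystroke; no explicit end-input detection.

def incremental_input_loop(detect_keystroke, rank_candidates, offer_candidates):
    """Repeat keystroke -> re-rank -> offer until the operator confirms a candidate."""
    keystroke_data = []                                # accumulated per-keystroke distance data
    while True:
        keystroke_data.append(detect_keystroke())      # blocks until the next simulated keystroke
        candidates = rank_candidates(keystroke_data)   # apply the sorting rules to all data so far
        chosen = offer_candidates(candidates)          # returns None if the operator keeps typing
        if chosen is not None:
            return chosen                              # confirming a candidate completes the input
```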

[0259] The flowchart includes:

[0260] Step 301 activates the input box or opens the program, a...

Abstract

The present invention discloses an augmented reality virtual keyboard input method and an apparatus using the same. The method comprises: determining the placement position and angle of the virtual keyboard according to a gesture action by which the operator positions the virtual keyboard, or according to presets; superimposing the virtual keyboard into a real scene by augmented reality technology, so that the virtual keyboard can sense the operator's actions; and analyzing the gesture actions by which the operator operates the virtual keyboard to obtain an input result and feeding the input result back to an operating system or an application. The operation mode is simple. During input, the user does not need to speak or wear a ring device, and no projection equipment needs to be integrated. The method and apparatus adapt to various input gestures and require no training of the system. They are suitable for various devices that need a keyboard to input information, such as mobile phones, single-chip microcomputer devices, smart furniture, smart appliances, and computers, and especially for wearable devices for which keyboard and touch-screen input is inconvenient, such as smart glasses. The method and apparatus greatly expand the input capability of a smart device that uses a virtual keyboard.
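Read as a pipeline, the abstract describes roughly the following cycle; the module and function names here are illustrative assumptions rather than the patent's actual interfaces.

```python
# High-level sketch of the input cycle described in the abstract
# (module and function names are assumptions, not the patent's API).

def run_virtual_keyboard_input(detection, analysis, renderer, deliver_result):
    """Place the keyboard, overlay it via AR, analyse gestures, feed back the result."""
    placement = analysis.keyboard_placement(detection.capture_positioning_gesture())
    renderer.overlay_keyboard(placement)       # superimpose the keyboard into the real scene
    while True:
        gesture = detection.capture_gesture()
        result = analysis.interpret(gesture, placement)
        if result is not None:
            deliver_result(result)             # hand the input result to the OS or application
            return result
```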

Description

Technical Field

[0001] The invention relates to the technical field of virtual keyboard input, in particular to an augmented reality virtual keyboard input method and a device using the method. The present invention can be used in devices that use a keyboard to input information, such as mobile smart devices, wearable smart devices, single-chip microcomputer devices, smart furniture, smart appliances, personal computers, medium-sized computers, and large-scale computers.

Background Technique

[0002] There are two types of portable smart devices today: smart devices suited to using a touch screen as the input solution (such as tablet computers and smart phones with touch screens), and smart devices not suited to using a touch screen as the input solution (for example, wearable smart devices represented by smart glasses and smart watches).

[0003] Wearable smart devices are inconvenient to operate through the touch screen, and voice recognit...

Application Information

IPC(8): G06F3/01
Inventor: 王登高
Owner: 王登高