
Gesture error correction method, system and device in augmented reality environment

An augmented-reality error correction technology, applied in the field of information processing, which addresses the insufficient timeliness and accuracy of prior-art gesture error correction and improves both the gesture recognition rate and the error correction efficiency.

Active Publication Date: 2019-07-09
UNIV OF JINAN


Problems solved by technology

[0005] The present application provides a gesture error correction method, system and device in an augmented reality environment, to solve the problem that the timeliness and accuracy of gesture error correction methods in the prior art are not high enough.



Examples


Embodiment 1

[0059] See figure 1, which is a schematic flowchart of a gesture error correction method in an augmented reality environment provided by an embodiment of the present application. As shown in figure 1, the gesture error correction method in this embodiment mainly includes the following steps:

[0060] S0: Obtain the first gesture depth map and gesture depth information of the human hand.

[0061] Here, the first gesture depth map is the original gesture depth map and includes both a static gesture depth map and a dynamic gesture depth map, while the gesture depth information includes joint point coordinates.

[0062] In this embodiment, a Kinect device can be used to obtain the first gesture depth map and the gesture depth information of the human hand, and the depth coordinates of the hand's skeletal nodes can be obtained from the Kinect skeleton data.
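As a minimal sketch of the data this step produces: the joint names and coordinate values below are hypothetical stand-ins for one frame of Kinect hand-joint output (the patent does not specify the joint set), and `to_feature_vector` illustrates one common way to make such coordinates position-invariant before recognition.

```python
import numpy as np

# Hypothetical example of one frame of hand-joint depth coordinates
# (x, y, z in metres), as might be read from the Kinect skeleton stream.
frame_joints = {
    "wrist":     (0.10, 0.20, 0.80),
    "palm":      (0.12, 0.25, 0.78),
    "thumb_tip": (0.15, 0.28, 0.77),
    "index_tip": (0.14, 0.32, 0.76),
}

def to_feature_vector(joints, origin="wrist"):
    """Translate all joints so the chosen origin joint sits at (0, 0, 0),
    giving a position-invariant array for downstream gesture matching."""
    ox, oy, oz = joints[origin]
    return np.array([(x - ox, y - oy, z - oz) for x, y, z in joints.values()])
```

Centring on the wrist (or any reference joint) removes the hand's absolute position so that only the hand's shape feeds the recognition model.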

[0063] S1: Pr...

Embodiment 2

[0105] Building on the embodiment illustrated in figure 1, see figure 2, which is a schematic structural diagram of a gesture error correction system in an augmented reality environment provided by an embodiment of the present application. As shown in figure 2, the gesture error correction system in this embodiment mainly includes: an information acquisition module, a preprocessing module, a first input module, a judgment module, a minimum feature distance determination module, a third gesture depth map determination module and a second input module.

[0106] Here, the information acquisition module is used to acquire the first gesture depth map and gesture depth information of the human hand; the first gesture depth map is the original gesture depth map and includes a static gesture depth map and a dynamic gesture depth map, and the gesture depth information includes joint point coordinates. The preproces...



Abstract

The invention discloses a gesture error correction method, system and device in an augmented reality environment. The method comprises the following steps: first, acquiring a first gesture depth map and gesture depth information of a hand; judging, by means of a gesture recognition model, whether the current gesture is recognizable; and, when the gesture depth map cannot be recognized, determining the minimum feature distance of the current second gesture depth map according to the Hausdorff distance, determining a third gesture depth map through the Hausdorff distance, and finally inputting the third gesture depth map into the gesture recognition model, so that gesture error correction is realized. The system comprises an information acquisition module, a preprocessing module, a first input module, a judgment module, a minimum feature distance determination module, a third gesture depth map determination module and a second input module. The device comprises a processor and a memory connected with the processor, the processor being able to execute the gesture error correction method. The real-time performance and accuracy of gesture error correction can thereby be greatly improved.
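The Hausdorff-distance matching described above can be sketched as follows. This is a generic illustration of the metric, not the patent's exact procedure: `best_template` stands in for the "minimum feature distance" selection, and the point sets would in practice come from gesture depth maps.

```python
import numpy as np

def directed_hausdorff(a, b):
    """Greatest distance from any point in set a to its nearest point in b."""
    # pairwise Euclidean distances: d[i, j] = ||a[i] - b[j]||
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1).max()

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two point sets."""
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))

def best_template(query, templates):
    """Index of the template point set closest to the query under the
    Hausdorff distance -- i.e. the minimum feature distance."""
    return min(range(len(templates)), key=lambda i: hausdorff(query, templates[i]))
```

Selecting the template with the minimum Hausdorff distance yields the corrected (third) gesture representation that is then fed back into the recognition model.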

Description

Technical field

[0001] The present application relates to the technical field of information processing, and in particular to a gesture error correction method, system and device in an augmented reality environment.

Background technique

[0002] Augmented reality is also called AR (Augmented Reality). With the development of human-computer interaction technology, augmented reality, as a technology that enhances interaction between the real and the virtual, has attracted more and more attention. Most current augmented reality approaches rely on wearable devices such as data gloves, or on gesture recognition sensors, to fuse real hands with virtual scenes. Since gesture recognition is sometimes inaccurate, gesture error correction is an important issue in this technology.

[0003] At present, a commonly used gesture error correction approach is a recognition method based on the Hidden Markov Model (HMM). Specifi...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62
CPC: G06V20/64; G06V40/28; G06F18/22; G06F18/214
Inventors: 冯志全, 肖梦婷
Owner: UNIV OF JINAN