
IPT simulation training gesture recognition method based on binocular vision

A gesture recognition and binocular vision technology, applied in the field of computer vision / human-computer interaction, which addresses problems such as the gap from real interactive operation, reduced recognition accuracy, and loss of the target gesture, with the effects of reducing training cost, improving operation training efficiency, and improving the user experience.

Active Publication Date: 2020-01-14
QINGDAO RES INST OF BEIHANG UNIV +1

AI Technical Summary

Problems solved by technology

[0003] However, the core components of data gloves are usually quite expensive, the gloves must be worn to operate the equipment, and interaction while wearing them is highly unnatural and differs considerably from real interactive operation, so they cannot give the operator a good user experience.
Moreover, most existing vision-based recognition technologies rely on a monocular camera. When the user moves quickly, the target gesture is easily lost in the foreground segmentation image, and tracking gestures by color alone is easily disturbed by ambient light, causing tracking loss and, in turn, reduced recognition accuracy.
[0004] In the IPT (Integrated Procedures Trainer, an important training device in the aerospace field) system, traditional tactile interactive operation can only simulate the operation sequence; it cannot simulate the operating environment.



Examples


Embodiment Construction

[0066] In order to make the above-mentioned objects and advantages of the present invention clearer, specific embodiments of the present invention are described in detail below in conjunction with the accompanying drawings:

[0067] This embodiment proposes a binocular vision-based IPT simulation training gesture recognition method. It adopts vision-based recognition with a binocular camera and makes innovative improvements to gesture segmentation, gesture tracking and gesture recognition within the visual gesture recognition pipeline: a gesture segmentation method based on depth images and background modeling, a Level Sets gesture tracking method based on depth features, and a hand shape classification method based on extended Haar-like features and improved Adaboost. Specifically, Figure 1 shows the functional block diagram of the gesture recognition method described in this embodiment:
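As a rough illustration of the segmentation stage only, the sketch below fuses a learned background model with a depth gate, so that a gesture blurred or lost in the color foreground can still be recovered from the depth cue. It is a minimal sketch under assumptions: the use of OpenCV's MOG2 subtractor, the depth range, and the function names are stand-ins, not the patent's actual algorithm (which additionally uses target motion trend estimation).

```python
import cv2
import numpy as np

# Background model learned online from the color stream (assumed choice of model).
bg_model = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25,
                                              detectShadows=False)

def segment_hand(color_frame, depth_frame, near_mm=300, far_mm=900):
    """Return a binary mask of the hand candidate region.

    color_frame: BGR image from one camera of the stereo pair.
    depth_frame: per-pixel depth in millimetres, e.g. from stereo matching (uint16).
    near_mm / far_mm: assumed working range of the operator's hand.
    """
    # 1. Motion cue: foreground mask from background modelling.
    fg_mask = bg_model.apply(color_frame)

    # 2. Depth cue: keep only pixels inside the expected hand range.
    depth_mask = cv2.inRange(depth_frame, near_mm, far_mm)

    # 3. Fuse both cues and remove small speckle with morphology.
    mask = cv2.bitwise_and(fg_mask, depth_mask)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # 4. Keep the largest connected blob as the hand candidate.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return np.zeros(mask.shape, np.uint8)
    hand = max(contours, key=cv2.contourArea)
    out = np.zeros(mask.shape, np.uint8)
    cv2.drawContours(out, [hand], -1, 255, thickness=cv2.FILLED)
    return out
```

The resulting mask would then be handed to the contour-based tracking stage described above; the fusion of motion and depth cues is what compensates for target loss in the pure color foreground.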

[0068] Step...



Abstract

The invention discloses an IPT simulation training gesture recognition method based on binocular vision. The method improves the interaction mode according to the characteristics of a VR application, with innovative improvements in three aspects: gesture segmentation, gesture tracking and gesture recognition. During gesture segmentation, the target-loss phenomenon in the foreground segmentation image caused by motion blur is compensated through background modeling, target motion trend estimation and other methods. During gesture tracking, chrominance information and depth information are combined, and the moving gesture during operation of the flight simulator is tracked through contour fitting. During gesture recognition, an extended Haar-like feature and an improved Adaboost method are proposed and used for hand shape classification, and a state area is defined. A good dynamic gesture recognition rate is obtained, the user experience during operation is improved, the operator can accurately check hand actions during operation, operation training efficiency is effectively improved, and training cost is reduced.
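For a flavor of the recognition stage, the sketch below computes a few rectangle (Haar-like) contrast features from an integral image and feeds them to an off-the-shelf AdaBoost classifier. It is an illustrative sketch under assumptions: the 24x24 patch size, the particular feature layout, and the use of scikit-learn's AdaBoostClassifier are stand-ins, not the patent's extended Haar-like features or its improved Adaboost.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def rect_sum(ii, x, y, w, h):
    """Sum of pixel values inside a rectangle, using the integral image ii."""
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def haar_features(patch):
    """A few two-rectangle contrast features on a 24x24 hand-shape patch (assumed size)."""
    # Integral image with a zero row/column prepended, so rect_sum indexing works.
    ii = np.pad(patch.astype(np.float64), ((1, 0), (1, 0))).cumsum(0).cumsum(1)
    feats = []
    for (x, y, w, h) in [(0, 0, 12, 24), (0, 0, 24, 12), (6, 6, 6, 12), (12, 6, 6, 12)]:
        left = rect_sum(ii, x, y, w // 2, h)
        right = rect_sum(ii, x + w // 2, y, w - w // 2, h)
        feats.append(left - right)   # basic left/right two-rectangle Haar-like feature
    return np.array(feats)

def train_hand_shape_classifier(patches, labels):
    """Train AdaBoost (decision stumps by default) on hypothetical labelled hand patches."""
    X = np.stack([haar_features(p) for p in patches])
    clf = AdaBoostClassifier(n_estimators=50)
    clf.fit(X, labels)
    return clf

# Hypothetical usage: labels could be gesture classes such as "press", "point", "grab".
# clf = train_hand_shape_classifier(training_patches, training_labels)
# prediction = clf.predict(haar_features(segmented_hand_patch).reshape(1, -1))
```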

Description

technical field

[0001] The invention belongs to the field of computer vision / human-computer interaction, and in particular relates to a binocular vision-based IPT simulation training gesture recognition method.

Background technique

[0002] At present, gesture recognition technology is mainly divided into data-glove-based recognition and vision-based recognition. Data-glove-based recognition uses sensor devices to convert hand and finger movement information into control commands a computer can understand, collecting gesture and gesture-movement data through the additional sensors. Vision-based recognition uses cameras to capture human gestures and converts them into computer-understandable commands through video image processing and understanding, achieving human-computer interaction and allowing natural interaction with the computer without wearing any additional equipment.

[0003] However, the core comp...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62, G06T7/11, G06F3/01
CPC: G06T7/11, G06F3/017, G06V40/28, G06V20/42, G06F18/214, Y02D10/00
Inventor: 严小天, 于洋, 王慧青, 刘训福, 田学博
Owner: QINGDAO RES INST OF BEIHANG UNIV