
Human-computer interaction method and device based on integration of eye movement tracking and gesture recognition in virtual assembly

A gesture recognition and virtual assembly technology, applied in the field of human-computer interaction. It addresses the limited naturalness and accuracy of existing interaction methods, and achieves the effects of improving accuracy and naturalness, reducing complexity, and eliminating redundant information.

Active Publication Date: 2019-10-22
微晶数实(山东)装备科技有限公司

AI Technical Summary

Problems solved by technology

However, these interaction methods have certain deficiencies in the naturalness and accuracy of the interaction.




Embodiment Construction

[0053] The present disclosure will be further described below in conjunction with the accompanying drawings and embodiments.

[0054] It should be noted that the following detailed description is exemplary and intended to provide further explanation of the present disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.

[0055] It should be noted that the terminology used herein is only for describing specific embodiments, and is not intended to limit the exemplary embodiments according to the present disclosure. As used herein, unless the context clearly dictates otherwise, the singular forms are intended to include the plural forms as well. It should also be understood that when the terms "comprising" and/or "including" are used in this specification, they indicate the presence of the stated features, steps, operations, devices, components, and/or combinations thereof.

...



Abstract

The invention provides a human-computer interaction method and device based on the integration of eye movement tracking and gesture recognition in virtual assembly. The method comprises the steps of: carrying out gaze point tracking according to the obtained eye movement data; performing gesture recognition according to the obtained gesture information; labeling the obtained gesture recognition data and eye movement data to form a training set; and constructing a multi-stream convolutional neural network and long short-term memory (CNN-LSTM) network model, wherein the network model performs self-learning using the training set. The optimal network model obtained by training is then applied to a virtual assembly process: eye movement data and gesture information of the virtual assembly process are obtained, eye movement and gesture features are extracted, and the feature information is analyzed to obtain the behavior category of the operator so as to complete an assembly task. The method solves the problem of misjudging similar behaviors in a single modality, exploits the advantages of deep learning algorithms, recognizes operator behaviors in video with high accuracy, completes the virtual assembly task, and achieves human-machine interaction.
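The abstract describes a two-stream pipeline: eye-movement features and gesture features are extracted per frame, fused, and classified into an operator behavior category. The patent does not disclose implementation details, so the following is only a minimal numpy sketch of that late-fusion idea under stated assumptions: mean pooling stands in for the per-stream CNN, and a single linear layer with softmax stands in for the LSTM and dense head. All names, shapes, and the class set are illustrative, not the patented method.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_stream_features(frames: np.ndarray) -> np.ndarray:
    """Stand-in for a per-stream CNN: temporal mean-pooling over frames."""
    return frames.mean(axis=0)

def fuse_and_classify(eye_frames, gesture_frames, W, b):
    """Late fusion: concatenate the two stream embeddings, then apply a
    linear classifier with softmax (stand-in for the LSTM + dense head)."""
    fused = np.concatenate([extract_stream_features(eye_frames),
                            extract_stream_features(gesture_frames)])
    logits = W @ fused + b
    exp = np.exp(logits - logits.max())   # numerically stable softmax
    return exp / exp.sum()

# Toy data: 30 frames of 4-D eye features and 6-D gesture features.
eye = rng.normal(size=(30, 4))
gesture = rng.normal(size=(30, 6))
n_classes = 3  # e.g. grasp / move / release assembly actions (illustrative)
W = rng.normal(size=(n_classes, 10))     # 10 = 4 + 6 fused dimensions
b = np.zeros(n_classes)

probs = fuse_and_classify(eye, gesture, W, b)
print(probs)  # one probability per behavior category, summing to 1
```

In a real system the pooling and linear layer would be replaced by trained multi-stream CNN and LSTM modules, but the fusion point (concatenating per-modality embeddings before classification) is the part this sketch illustrates.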

Description

Technical Field

[0001] The disclosure belongs to the technical field of human-computer interaction, and in particular relates to a human-computer interaction method and device that integrate eye tracking and gesture recognition in virtual assembly.

Background Technique

[0002] The statements in this section merely provide background information related to the present disclosure and do not necessarily constitute prior art.

[0003] With the development of information technology, human-computer interaction is gradually changing from machine-centered to human-centered; that is, in a virtual environment, machines are used as auxiliary tools to assist humans in completing interactive tasks. At the same time, virtual reality technology is also developing rapidly. As an application of virtual reality technology, virtual assembly is widely used in product design verification and computer-aided assembly planning. Virtual assembly uses technologies such as data processing, human-com...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/01; G06K9/00; G06K9/62
CPC: G06F3/013; G06F3/017; G06F2203/012; G06V40/107; G06V40/28; G06V40/197; G06F18/241
Inventor: 杨晓晖, 察晓磊, 徐涛, 冯志全, 吕娜, 范雪
Owner 微晶数实(山东)装备科技有限公司