
A dynamic gesture recognition method, gesture interaction method and interaction system

A dynamic gesture recognition technology applied in the field of computer vision. It addresses the problems of complex installation, a cumbersome process, and a high computing load, and achieves improved speed and accuracy, simple and fast processing, and accurate, efficient human-computer interaction.

Active Publication Date: 2022-07-08
OMNIVISION SENSOR SOLUTION (SHANGHAI) CO LTD

AI Technical Summary

Problems solved by technology

However, this method is limited by the data frame rate and has a long response time. Because it involves many steps and a cumbersome process, it also incurs a high computational load and high power consumption during algorithm processing.
In addition, a traditional human-computer interaction system needs to be integrated with the hardware platform, which makes installation more complicated.




Embodiment Construction

[0035] Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that the present disclosure will be more thoroughly understood, and will fully convey the scope of the present disclosure to those skilled in the art.

[0036] In recent years, the Dynamic Vision Sensor (DVS) has received increasing attention and application in the field of computer vision. A DVS is a biomimetic vision sensor that mimics the human retina using pulse-triggered neurons. Inside the sensor is a pixel-unit array composed of multiple pixel units, in which each pixel unit responds to and records areas of rapidly changing light intensity only when it senses...
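As a rough illustration of this event-driven output, the sketch below models a DVS event stream as a sequence of (x, y, timestamp, polarity) records and filters it to a region of interest. The Event class and filter_active_region helper are illustrative assumptions, not the sensor's actual interface.

```python
# Minimal sketch (assumed names, not a real sensor API): a DVS outputs an
# asynchronous stream of per-pixel events instead of full image frames.
from dataclasses import dataclass
from typing import Iterable, Iterator


@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    t: float        # timestamp of the brightness change
    polarity: int   # +1 = brightness increase, -1 = brightness decrease


def filter_active_region(events: Iterable[Event],
                         x_min: int, x_max: int,
                         y_min: int, y_max: int) -> Iterator[Event]:
    """Keep only events inside a region of interest, e.g. around a detected hand."""
    for e in events:
        if x_min <= e.x <= x_max and y_min <= e.y <= y_max:
            yield e


# Example: two synthetic events, only the second falls inside the region.
roi = filter_active_region([Event(5, 5, 0.0, 1), Event(50, 60, 1.0, -1)], 40, 80, 40, 80)
print(list(roi))  # -> [Event(x=50, y=60, t=1.0, polarity=-1)]
```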



Abstract

The invention discloses a dynamic gesture recognition method, a gesture interaction method, and an interaction system. The interaction system includes: a dynamic vision sensor, adapted to trigger events based on the relative motion between objects in the field of view and the sensor and to output the event data stream to the hand detection module; a hand detection module, adapted to process the event data stream to determine the initial position of the hand; a hand tracking module, adapted to determine, based on the initial hand position, a series of state vectors indicating the movement state of the hand in the event data stream; a gesture recognition module, adapted to obtain the event data pointed to by the state vectors, construct an event cloud, and process the event cloud with a point-cloud-based neural network to identify the gesture category; and a command response module, adapted to execute the corresponding operating instruction based on the identified gesture category. The invention also discloses a corresponding computing device.
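To make the division of labor among these modules easier to follow, here is a minimal Python sketch of the pipeline under assumed names and stub logic: sensor events feed hand detection, tracking produces state vectors, the referenced events form an event cloud for classification, and the result maps to an operating instruction. The placeholder rule in recognize_gesture merely stands in for the point-cloud-based neural network; none of this is the patented implementation.

```python
# Illustrative pipeline sketch; every class, function, and rule here is a stub.
from typing import List, NamedTuple, Tuple


class Event(NamedTuple):
    x: int
    y: int
    t: float        # timestamp
    polarity: int


class StateVector(NamedTuple):
    t: float
    x: float        # estimated hand position
    y: float
    vx: float       # estimated hand velocity
    vy: float


def detect_hand(events: List[Event]) -> Tuple[float, float]:
    """Hand detection module (stub): use the event centroid as the initial hand position."""
    return (sum(e.x for e in events) / len(events),
            sum(e.y for e in events) / len(events))


def track_hand(batches: List[List[Event]]) -> List[StateVector]:
    """Hand tracking module (stub): follow the centroid batch by batch and derive a velocity."""
    states: List[StateVector] = []
    px, py = detect_hand(batches[0])
    pt = batches[0][0].t
    for batch in batches:
        cx, cy = detect_hand(batch)
        t = batch[-1].t
        dt = max(t - pt, 1e-6)
        states.append(StateVector(t, cx, cy, (cx - px) / dt, (cy - py) / dt))
        px, py, pt = cx, cy, t
    return states


def recognize_gesture(batches: List[List[Event]], states: List[StateVector]) -> str:
    """Gesture recognition module (stub): gather the events referenced by the state
    vectors into an (x, y, t) event cloud; a real system would classify this cloud
    with a point-cloud neural network rather than the placeholder rule below."""
    cloud = [(e.x, e.y, e.t) for batch in batches for e in batch]
    if not cloud:
        return "none"
    return "swipe_right" if states[-1].vx >= 0 else "swipe_left"


def respond(gesture: str) -> str:
    """Command response module (stub): map the gesture category to an operating instruction."""
    return {"swipe_right": "next_page", "swipe_left": "previous_page"}.get(gesture, "noop")


# Example usage with two synthetic event batches of a hand moving to the right.
batches = [
    [Event(10, 20, 0.0, 1), Event(11, 21, 0.1, 1)],
    [Event(30, 20, 0.2, 1), Event(31, 21, 0.3, 1)],
]
print(respond(recognize_gesture(batches, track_hand(batches))))  # -> next_page
```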

Description

Technical Field
[0001] The invention relates to the technical field of computer vision, and in particular to a human-computer interaction method based on dynamic gesture recognition.
Background
[0002] A human-computer interaction system studies the mutual understanding and communication between people and computers in order to provide information management, services, and processing for people to the greatest possible extent. From the initial interaction based on traditional hardware devices, such as the mouse and keyboard, to today's computer-vision-based human-computer interaction systems, human-computer interaction has developed very rapidly.
[0003] Among these systems, one typical form of human-computer interaction is based on speech recognition, the most representative example being Siri launched by Apple; the most representative example of human-computer interaction based on action recognition is Microsoft's Kinect; ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06V40/20, G06V10/764, G06V10/82, G06K9/62, G06N3/02, G06F3/01
CPC: G06N3/02, G06F3/017, G06V40/28, G06F18/24, Y02D10/00
Inventors: 施顺顺, 武斌, 蒋睿, 王沁怡, 张伟, 宫超
Owner: OMNIVISION SENSOR SOLUTION (SHANGHAI) CO LTD