
Gesture recognition device and man-machine interaction system

A gesture recognition technology applied in the field of human-computer interaction systems. It addresses problems such as the complex structure and low efficiency of existing recognition devices, achieving improved recognition efficiency with a simple structure.

Inactive Publication Date: 2018-08-28
HONG FU JIN PRECISION IND (SHENZHEN) CO LTD +1

AI Technical Summary

Problems solved by technology

However, because a specialized neural network is required to judge the position of the hand in the image, such a recognition device has a complex structure and low efficiency.


Examples

Embodiment 1

[0029] See Figure 2. In this embodiment, the gesture recognition device 11 includes: a control module 110, a gesture sensing module 111, a calculation module 112, a gesture recognition module 113 and a communication module 114. The gesture sensing module 111, calculation module 112, gesture recognition module 113 and communication module 114 are each connected to the control module 110.

[0030] The control module 110 controls the operation of the entire gesture recognition device 11. The gesture sensing module 111 collects position data of user gestures and includes a 3D sensing device, which may be any 3D sensing device, such as an infrared, laser or ultrasonic sensing device. In this embodiment, the 3D sensing device is a Leap Motion controller. The calculation module 112 analyzes and processes the position data of the user gesture and other data. The gesture recognition module 113 recognizes the user gesture according to the position data of the user gesture.
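The five-module architecture of Embodiment 1 can be sketched as plain Python classes. This is an illustrative assumption about how the modules might compose, not code from the patent; the class names mirror the description, and the stub behaviors (a single sampled point, an identity "processing" step, a trivial recognizer) stand in for real device logic.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class GestureSensingModule:
    """Collects 3D position data of the user's hand (e.g. via a Leap Motion)."""
    def collect(self) -> List[Point3D]:
        # Stub: a real device would return sampled fingertip coordinates.
        return [(0.0, 0.0, 0.0)]

@dataclass
class CalculationModule:
    """Analyzes and processes the raw position data."""
    def process(self, points: List[Point3D]) -> List[Point3D]:
        # A real implementation might smooth or normalize; identity here.
        return points

@dataclass
class GestureRecognitionModule:
    """Recognizes the gesture from the processed position data."""
    def recognize(self, points: List[Point3D]) -> str:
        # Hypothetical rule: multiple samples look like motion, one like a point.
        return "swipe" if len(points) > 1 else "point"

@dataclass
class CommunicationModule:
    """Sends the recognized gesture to a host system."""
    sent: list = field(default_factory=list)
    def send(self, gesture: str) -> None:
        self.sent.append(gesture)

@dataclass
class ControlModule:
    """Coordinates the other four modules, as in Embodiment 1."""
    sensing: GestureSensingModule
    calc: CalculationModule
    recog: GestureRecognitionModule
    comm: CommunicationModule

    def run_once(self) -> str:
        points = self.sensing.collect()
        processed = self.calc.process(points)
        gesture = self.recog.recognize(processed)
        self.comm.send(gesture)
        return gesture
```

The control module owning references to the other four mirrors the patent's statement that each module is connected to the control module.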

Embodiment 2

[0044] See Figure 5. In this embodiment, the gesture recognition device 11A includes: a control module 110, a gesture sensing module 111, a calculation module 112, a gesture recognition module 113, a communication module 114, a first judging module 115 and a second judging module 116.

[0045] The gesture recognition device 11A of Embodiment 2 is basically the same in structure as the gesture recognition device 11 of Embodiment 1; the difference is that it further includes the second judging module 116, which judges whether a gesture input start instruction or a gesture input end instruction has been received.

[0046] The second judging module 116 may determine whether a gesture input start or end instruction has been received by, for example, judging whether the communication module 114 has received an instruction from an external remote control device, or by ju...
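A minimal sketch of the second judging module's role, under the assumption that instructions arrive through the communication module and are buffered until judged. The class name, the queue, and the string instruction values are all hypothetical; the patent only says the module judges whether a start or end instruction was received.

```python
from collections import deque
from typing import Optional

class SecondJudgingModule:
    """Judges whether a gesture input start/end instruction was received."""
    START = "start"
    END = "end"

    def __init__(self) -> None:
        # Instructions forwarded from e.g. an external remote control device.
        self.pending: deque = deque()

    def receive(self, instruction: str) -> None:
        self.pending.append(instruction)

    def judge(self) -> Optional[str]:
        """Return 'start' or 'end' if such an instruction arrived, else None."""
        if self.pending:
            instruction = self.pending.popleft()
            if instruction in (self.START, self.END):
                return instruction
        return None
```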

Embodiment 3

[0060] See Figure 7. In this embodiment, the gesture recognition device 11B includes: a control module 110, a gesture sensing module 111, a calculation module 112, a gesture recognition module 113, a communication module 114, a first judging module 115, a second judging module 116 and a third judging module 117.

[0061] The gesture recognition device 11B of Embodiment 3 is basically the same in structure as the gesture recognition device 11A of Embodiment 2; the difference is that it further includes the third judging module 117, which judges whether the gesture input mode is a 2D input mode or a 3D input mode.

[0062] The third judging module 117 may determine whether the gesture input mode is the 2D input mode or the 3D input mode by judging whether the communication module 114 has received an instruction from an external remote control device, or by judging whether the gesture recognition module 113 detects a ...
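One plausible way such a 2D/3D mode judgment could work on sensed position data is to look at how far the hand moves along the depth axis. The decision rule below (z-axis spread against a threshold) is purely an illustrative assumption; the patent does not specify the criterion.

```python
from typing import List, Tuple

Point3D = Tuple[float, float, float]

def judge_input_mode(samples: List[Point3D], z_threshold: float = 5.0) -> str:
    """Return '2D' when the hand stays near a single plane, else '3D'.

    samples: recent (x, y, z) positions from the gesture sensing module.
    z_threshold: hypothetical depth spread (in sensor units) above which
    the motion is treated as three-dimensional.
    """
    zs = [p[2] for p in samples]
    spread = (max(zs) - min(zs)) if zs else 0.0
    return "3D" if spread > z_threshold else "2D"
```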



Abstract

The invention relates to a gesture recognition device comprising: a control module; a gesture sensing module, which collects the position data of the user gesture; a computing module, which analyzes and processes the position data of the user gesture and other data; a gesture recognition module, which recognizes the user gesture according to the position data of the user gesture; and a communication module. The gesture sensing module comprises a 3D sensing device.

Description

Technical Field

[0001] The invention relates to the technical field of computer vision recognition, and in particular to a gesture recognition device and a human-computer interaction system using the gesture recognition device.

Background

[0002] Machine learning has advanced research on pattern recognition and computational learning theory in artificial intelligence. Deep learning, a branch of machine learning, builds on a series of mathematical algorithms organized as a deep graph of multiple processing layers, applying linear and nonlinear transformations to model high-level abstractions in data. With the rapid development of technology, deep learning is widely applied in areas such as cloud computing, medicine, media security, and autonomous vehicles.

[0003] Besides artificial intelligence, virtual reality and augmented reality are also currently booming t...


Application Information

IPC(8): G06K9/00; G06F3/01
CPC: G06F3/017; G06V20/64; G06V40/107; G06V40/28
Inventor: 魏崇哲
Owner: HONG FU JIN PRECISION IND (SHENZHEN) CO LTD