
Regional convolutional neural network-based method for gesture identification and interaction under egocentric vision

A convolutional neural network and gesture recognition technology, applied to neural learning methods, biological neural network models, and character and pattern recognition. It addresses the sensitivity of prior algorithm models' recognition rates to gesture speed, direction, and hand size, with the effect of improving recognition speed and accuracy, stabilizing the gesture recognition rate, and supporting recognition at longer distances.

Active Publication Date: 2017-09-15
SOUTH CHINA UNIV OF TECH
Cites: 5 · Cited by: 46

AI Technical Summary

Problems solved by technology

These traditional methods usually rely on manually predefined features for hand-shape description, such as the scale-invariant feature transform (SIFT), speeded-up robust features (SURF), the histogram of oriented gradients (HOG), and Fourier descriptors, together with optical-flow and motion-tracking methods for describing gesture motion. Manual feature selection has serious limitations: it usually requires prior knowledge, experience, and extensive manual tuning, and the recognition rate of such algorithm models is strongly affected by differences in gesture speed, direction, and hand size.
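To make the contrast with learned CNN features concrete, the sketch below computes a minimal HOG-style descriptor: per-cell histograms of gradient orientation, weighted by gradient magnitude. This is an illustrative simplification (no block normalization or trilinear interpolation), not the full HOG pipeline the patent cites.

```python
import numpy as np

def hog_descriptor(gray, cell=8, bins=9):
    """Minimal HOG-style descriptor: one orientation histogram per
    cell, magnitude-weighted. A sketch of the kind of hand-crafted
    feature the patent contrasts with learned CNN features."""
    gx = np.zeros_like(gray, dtype=float)
    gy = np.zeros_like(gray, dtype=float)
    gx[:, 1:-1] = gray[:, 2:] - gray[:, :-2]    # central differences
    gy[1:-1, :] = gray[2:, :] - gray[:-2, :]
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180  # unsigned orientation

    h, w = gray.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            m = mag[i:i + cell, j:j + cell].ravel()
            a = ang[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist)
    return np.concatenate(feats)

desc = hog_descriptor(np.random.rand(32, 32))
print(desc.shape)  # (144,): 4x4 cells x 9 orientation bins
```

Every design choice here (cell size, bin count, gradient operator) is exactly the kind of manual tuning the background section criticizes, and none of it adapts to gesture speed, direction, or hand size.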

Method used



Examples


Embodiment

[0020] As shown in figure 1, the egocentric-vision gesture recognition and interaction method based on a regional convolutional neural network of the present invention includes the following steps:

[0021] S1. Acquire training data and manually annotate its labels. The labels include the upper-left and lower-right corners of the hand-region foreground, the coordinates of the skeleton nodes of each gesture, and the manually assigned gesture category.
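One annotation record per training image, as described in S1, could be organized as below. The field names and the sample values are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical per-image annotation for the S1 training labels:
# hand-region box corners, skeleton node coordinates, gesture class.
@dataclass
class GestureLabel:
    image_path: str
    box_tl: Tuple[int, int]           # upper-left corner of hand region
    box_br: Tuple[int, int]           # lower-right corner of hand region
    keypoints: List[Tuple[int, int]]  # skeleton node (x, y) coordinates
    gesture_class: int                # manually assigned category id

label = GestureLabel(
    image_path="frame_0001.png",
    box_tl=(120, 80),
    box_br=(360, 400),
    keypoints=[(200, 150), (240, 160)],
    gesture_class=2,
)
print(label.gesture_class)  # 2
```

Keeping all three label types in one record mirrors the single-model, multi-task training the abstract describes: each image supervises detection, classification, and keypoint regression at once.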

[0022] When acquiring data, the camera is placed at the position of the human eye, with its viewing direction consistent with the eyes' direct gaze direction. Video-stream information is collected continuously and converted into RGB images; the images include various gestures (as shown in figure 2, a-f). The camera is an ordinary 2D camera, and each collected image is an ordinary 640*480 RGB image. The training data includes a variety of different gestures, an...
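The capture-and-convert step above can be sketched as follows. OpenCV's `VideoCapture` delivers frames in BGR channel order, so a BGR-to-RGB conversion is needed; to keep the sketch runnable without a camera or OpenCV installed, the `cv2` calls appear only as comments and a synthetic 640*480 frame stands in for a captured one.

```python
import numpy as np

# With OpenCV the acquisition loop would look like:
#   cap = cv2.VideoCapture(0)      # ordinary head-mounted 2D camera
#   ok, frame_bgr = cap.read()     # 480x640x3 array, BGR channel order
# Here a synthetic pure-blue frame stands in for cap.read().
frame_bgr = np.zeros((480, 640, 3), dtype=np.uint8)
frame_bgr[..., 0] = 255  # blue channel first in BGR order

# Reversing the channel axis is equivalent to
# cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB).
frame_rgb = frame_bgr[..., ::-1]

print(frame_rgb.shape, frame_rgb[0, 0].tolist())  # (480, 640, 3) [0, 0, 255]
```

The 640*480 size matches the image size stated in paragraph [0022]; everything else (device index, channel convention) is a standard OpenCV assumption rather than something the patent specifies.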



Abstract

The invention discloses a regional convolutional neural network-based method for gesture identification and interaction under egocentric vision. The method comprises the following steps: S1, obtaining training data; S2, designing a regional neural network used simultaneously for hand detection, gesture classification, and fingertip detection, whose input is a three-channel RGB image and whose outputs are the upper-left and lower-right corner coordinates of the bounding rectangle of the gesture region, the gesture type, and the gesture skeleton key points; and S3, judging the gesture type and outputting the corresponding interactive result according to different interactive demands. The invention provides a complete method for gesture identification and interaction under egocentric vision. Through single-model training and partial network sharing, the speed and accuracy of gesture identification under egocentric vision are improved.
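The shared-backbone, multi-head design that the abstract describes (one network, three outputs) can be sketched in miniature: a single feature extractor feeds separate heads for the hand box, the gesture class, and the skeleton keypoints. The layer sizes, the six-class count, and the 21-keypoint count below are illustrative assumptions; the patent does not specify them, and a real implementation would use convolutional layers rather than these random linear maps.

```python
import numpy as np

rng = np.random.default_rng(0)

def backbone(x):
    # Stand-in for the shared convolutional feature extractor
    # ("partial network sharing" in the abstract).
    W = rng.standard_normal((x.size, 64))
    return np.tanh(x.ravel() @ W)

def head(feat, out_dim):
    # One task-specific linear head on top of the shared features.
    W = rng.standard_normal((feat.size, out_dim))
    return feat @ W

x = rng.standard_normal((8, 8, 3))  # stand-in for a 3-channel RGB input
feat = backbone(x)                  # computed once, reused by all heads

box = head(feat, 4)        # upper-left and lower-right box corners
logits = head(feat, 6)     # scores over six hypothetical gesture classes
kpts = head(feat, 2 * 21)  # (x, y) for 21 hypothetical skeleton nodes

print(box.shape, logits.shape, kpts.shape)  # (4,) (6,) (42,)
```

Because the expensive shared computation runs once per image and only the cheap heads differ per task, this structure is what lets the method claim improved recognition speed over running three separate models.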

Description

technical field [0001] The invention belongs to the technical field of computer vision and machine learning, and in particular relates to an egocentric-vision gesture recognition and interaction method based on a regional convolutional neural network. Background technique [0002] With the introduction of many virtual reality (Virtual Reality, VR) and augmented reality (Augmented Reality, AR) device products, research enthusiasm for human-computer interaction has grown considerably. Of the whole human body, the hand is considered the most important and common interaction tool, and in the field of human-computer interaction, gesture recognition has become a very important research direction. With the rise of AR and VR, egocentric-vision gesture interaction technology has gradually attracted attention, and more and more scholars and companies have invested manpower and material resources in corresponding research and develo...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; G06K9/00; G06N3/08
CPC: G06F3/017; G06N3/08; G06F2203/012; G06V40/28
Inventor: 郑晓旭, 徐向民, 殷瑞祥, 蔡博仑
Owner: SOUTH CHINA UNIV OF TECH