
Three-dimensional gesture attitude estimation method

A three-dimensional gesture pose estimation technology in the fields of computer vision and deep learning. It addresses the low recognition accuracy, large number of convolutional-neural-network training parameters, and slow training and testing speed of existing methods, achieving accurate gesture recognition, faster training, and a reduced parameter count.

Active Publication Date: 2017-08-29
SHENZHEN INST OF FUTURE MEDIA TECH +1


Problems solved by technology

[0005] The main purpose of the present invention is to propose a three-dimensional gesture pose estimation method based on depth maps and a fully convolutional neural network, so as to overcome the low recognition accuracy, large number of training parameters, and slow training and testing speed of existing gesture recognition methods that use a deep convolutional neural network.
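The parameter savings claimed above come from the fully convolutional design: a conventional classification CNN ends in fully connected layers, whose parameter count scales with the full spatial extent of the feature map, while a fully convolutional network shares weights across positions. The following is a hypothetical illustration with assumed layer sizes (none of these numbers appear in the patent):

```python
# Hypothetical illustration (sizes are assumptions, not from the patent):
# why replacing a fully connected head with a 1x1 convolution cuts parameters.
H = W = 7          # spatial size of the last feature map
C_IN = 256         # channels of the last feature map
HIDDEN = 4096      # width of a typical fully connected hidden layer
C_OUT = 4096       # matching channel count for the convolutional substitute

# A fully connected layer flattens the map, so every output unit connects
# to every input activation.
fc_params = (H * W * C_IN) * HIDDEN       # weights only, biases ignored

# A fully convolutional network replaces it with a 1x1 convolution whose
# weights are shared across all spatial positions.
conv1x1_params = (1 * 1 * C_IN) * C_OUT   # weights only, biases ignored

print(fc_params)                     # 51,380,224
print(conv1x1_params)                # 1,048,576
print(fc_params // conv1x1_params)   # 49x fewer weights (= H * W)
```

The ratio is exactly the spatial area H×W of the feature map, which is why fully convolutional architectures both shrink the model and speed up training and testing, as the patent claims.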




Detailed Description of the Embodiments

[0033] The present invention will be further described below in conjunction with the accompanying drawings and preferred embodiments.

[0034] A specific embodiment of the present invention provides a three-dimensional gesture pose estimation method comprising the following steps S1 to S6:

[0035] S1. Obtain multiple gesture depth maps, segment the gesture foreground from the background in each depth map to obtain multiple gesture foreground maps, and randomly divide them into a training set and a test set. The specific process of step S1 includes: using a depth camera to capture multiple gesture depth maps of different people and applying an affine transformation so that all the depth maps have the same size; using a random forest classifier to segment the gesture foreground from the background, obtaining multiple gesture foreground maps; and randomly dividing the multiple gesture foreground maps into a large number of...
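The preprocessing in step S1 can be sketched as follows. This is a minimal stand-in, not the patent's implementation: the target size is an assumption, the affine normalization is modeled as a nearest-neighbor rescale, and the random-forest segmentation is replaced by a placeholder depth-band threshold.

```python
import numpy as np

TARGET = (96, 96)  # assumed common size; the patent does not fix one

def rescale_nearest(depth, size=TARGET):
    """Nearest-neighbor rescale, a stand-in for the affine size normalization."""
    h, w = depth.shape
    rows = np.arange(size[0]) * h // size[0]
    cols = np.arange(size[1]) * w // size[1]
    return depth[rows[:, None], cols]

def segment_foreground(depth, near=200, far=600):
    """Placeholder for the random-forest hand segmentation:
    keep depths inside an assumed near/far band, zero out the rest."""
    mask = (depth > near) & (depth < far)
    return depth * mask

def make_dataset(depth_maps, train_frac=0.8, seed=0):
    """Normalize, segment, and randomly split into training and test sets."""
    fgs = [segment_foreground(rescale_nearest(d)) for d in depth_maps]
    order = np.random.default_rng(seed).permutation(len(fgs))
    n_train = int(train_frac * len(fgs))
    train = [fgs[i] for i in order[:n_train]]
    test = [fgs[i] for i in order[n_train:]]
    return train, test

# Usage with synthetic depth maps of varying sizes:
maps = [np.random.default_rng(i).integers(0, 1000, (120 + i, 160))
        for i in range(10)]
train, test = make_dataset(maps)
print(len(train), len(test), train[0].shape)  # 8 2 (96, 96)
```

In practice the segmentation step would be a trained per-pixel classifier (the patent names a random forest), and the affine transform would also center and align the hand, but the data flow is the same.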



Abstract

The invention discloses a three-dimensional gesture pose estimation method, comprising: S1) obtaining a plurality of gesture depth maps, segmenting the gesture foreground from the background to obtain a plurality of gesture foreground maps, and randomly dividing the foreground maps into a training set and a test set; S2) creating, according to a gesture model map, an actual label map for each gesture foreground map, wherein each actual label map contains a plurality of coordinate points representing the reference identification points of a person's hand in the gesture foreground map, each point comprising the coordinate value and the depth value of the corresponding reference identification point; S3) selecting a plurality of gesture foreground maps from the training set to train the fully convolutional neural network, which correspondingly outputs a plurality of predicted label maps; S4) comparing the deviations between the actual label maps and the predicted label maps, and updating the network parameters accordingly; S5) iterating to continually reduce the deviations until the network parameters converge; and S6) inputting a gesture depth map to be estimated and outputting the corresponding label map as the estimation result.
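The training loop of steps S3-S5 can be sketched schematically. The stand-in "network" below is a deliberately tiny linear map from a flattened foreground image to a label vector of (x, y, depth) triples per hand keypoint; the real method uses a fully convolutional network, and the keypoint count, sizes, and learning rate here are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
N_KEYPOINTS = 14                 # assumed number of hand reference points
IN_DIM = 16 * 16                 # flattened (toy-sized) foreground map
OUT_DIM = N_KEYPOINTS * 3        # (x, y, depth) per keypoint

# Synthetic training pairs: foreground maps and their actual label maps (S2).
X = rng.normal(size=(32, IN_DIM))
W_true = rng.normal(size=(IN_DIM, OUT_DIM)) / IN_DIM
Y = X @ W_true                   # "actual label maps"

W = np.zeros((IN_DIM, OUT_DIM))  # network parameters to be learned
lr = 0.05
for step in range(500):          # S5: iterate until the deviation converges
    pred = X @ W                 # S3: predicted label maps
    err = pred - Y               # S4: deviation from the actual labels
    W -= lr * (X.T @ err) / len(X)   # S4: gradient update of the parameters

final_loss = ((X @ W - Y) ** 2).mean()
print(final_loss)  # the deviation has been driven close to zero
```

The essential structure (forward pass, deviation against ground-truth label maps, parameter update, iterate to convergence) is the same whether the model is this toy linear map or the patent's fully convolutional network trained by backpropagation.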

Description

Technical field

[0001] The invention relates to the fields of computer vision and deep learning, and in particular to a three-dimensional gesture pose estimation method.

Background technique

[0002] In recent years, with the gradual popularization of virtual reality and augmented reality technology and their vast development prospects, gesture recognition, as an important means of human-computer interaction, has drawn great attention in the field of computer vision. Compared with other body parts, the human hand is more complex, has more degrees of freedom, and is prone to occlusion, so quickly and accurately recognizing hand positions and gesture actions has always been a difficult problem. [0003] Traditional gesture estimation methods can generally be divided into two types: sensor-based and image-based. Sensor-based gesture pose estimation fixes sensors such as accelerometers and angular velocity meters to specific parts of a person's palm and fingers, so th...


Application Information

IPC (IPC-8): G06T7/207; G06T3/00; G06T7/194; G06T7/215
CPC: G06T7/194; G06T7/207; G06T7/215; G06T3/147
Inventor: 王好谦, 李达, 王兴政, 方璐, 张永兵, 戴琼海
Owner SHENZHEN INST OF FUTURE MEDIA TECH