Three-dimensional gesture posture estimation method
A three-dimensional gesture posture estimation technology, applied in the fields of computer vision and deep learning, addressing problems such as low recognition accuracy, the large number of training parameters in convolutional neural networks, and slow training and testing speeds, to achieve accurate gesture recognition, faster training, and a reduced parameter count.
Detailed Description of the Embodiments
[0033] The present invention will be further described below in conjunction with the accompanying drawings and preferred embodiments.
[0034] The specific embodiment of the present invention proposes a method for estimating a three-dimensional gesture posture, which includes the following steps S1 to S6:
[0035] S1. Obtain multiple gesture depth maps, segment the gesture foreground from the background in each depth map to obtain multiple gesture foreground maps, and randomly divide them into a training set and a test set. The specific process of step S1 includes: using a depth camera to capture multiple gesture depth maps of different people and applying an affine transformation so that all depth maps have the same size; using a random forest classifier to segment the gesture foreground from the background, obtaining multiple gesture foreground maps; and randomly dividing the multiple gesture foreground maps into a large number of...
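The preprocessing pipeline of step S1 can be sketched as follows. This is a minimal illustration, not the patented implementation: the target size, the synthetic depth maps, and the depth-threshold segmentation (used here as a simple stand-in for the patent's trained random forest classifier) are all assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

TARGET = (64, 64)  # assumed common size after affine normalization

def affine_resize(depth, size=TARGET):
    """Nearest-neighbour rescale: a minimal affine transformation that
    brings every depth map to the same size."""
    h, w = depth.shape
    rows = np.arange(size[0]) * h // size[0]
    cols = np.arange(size[1]) * w // size[1]
    return depth[np.ix_(rows, cols)]

def segment_foreground(depth, max_hand_depth=800):
    """Depth-threshold stand-in for the random-forest segmentation:
    keep pixels closer than max_hand_depth (mm) as hand foreground."""
    mask = (depth > 0) & (depth < max_hand_depth)
    return np.where(mask, depth, 0)

# Synthetic depth maps of varying sizes (hand near camera, background far),
# standing in for depth-camera captures of different people.
raw_maps = []
for h, w in [(120, 160), (96, 128), (240, 320)]:
    d = rng.integers(1000, 2000, size=(h, w))    # background depth (mm)
    d[h // 4: h // 2, w // 4: w // 2] = 500      # "hand" region
    raw_maps.append(d)

# Normalize sizes, then segment foreground from background.
foregrounds = [segment_foreground(affine_resize(d)) for d in raw_maps]

# Randomly divide the foreground maps into training and test sets.
idx = rng.permutation(len(foregrounds))
split = int(0.67 * len(foregrounds))
train = [foregrounds[i] for i in idx[:split]]
test = [foregrounds[i] for i in idx[split:]]
```

In a full implementation, the threshold function would be replaced by a per-pixel random forest classifier trained on labeled hand/background depth features, and the split would cover many more samples.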