Hierarchical dance pose estimation method based on sequential multi-scale deep feature fusion
A pose estimation and deep feature technology applied in the field of computer vision. It addresses problems such as the difficulty of accurately estimating a dancer's movement changes, the difficulty of detection, and the low accuracy of existing dance pose estimation, thereby improving the robustness and accuracy of the estimation.
Embodiment Construction
[0047] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
[0048] Referring to Figure 1, the present invention provides a hierarchical dance pose estimation method based on sequential multi-scale deep feature fusion. The method comprises YOLOv3-based human body frame detection, sequential multi-scale feature fusion, and hierarchical real-time pose estimation based on the geometric relationships of the joint points. The present invention adopts a top-down framework, first using YOLOv3 to detect the dancer's body...
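Paragraph [0048] outlines a top-down architecture, which the following minimal PyTorch sketch illustrates: a detector proposes dancer boxes, each crop passes through a pose network whose multi-scale backbone features are fused before heatmap decoding. The names (MultiScaleFusion, estimate_poses) and the detector/pose-network interfaces are hypothetical stand-ins, not the patent's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleFusion(nn.Module):
    """Project features from several backbone stages to a common channel
    width, upsample them to the finest resolution, and sum them."""

    def __init__(self, in_channels, out_channels=256):
        super().__init__()
        self.proj = nn.ModuleList(
            [nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels]
        )

    def forward(self, feats):
        # feats: list of (N, C_i, H_i, W_i) tensors, finest resolution first
        target = feats[0].shape[-2:]
        outs = []
        for f, proj in zip(feats, self.proj):
            f = proj(f)
            if f.shape[-2:] != target:
                f = F.interpolate(f, size=target, mode="bilinear",
                                  align_corners=False)
            outs.append(f)
        return torch.stack(outs).sum(dim=0)


def estimate_poses(frame, detector, pose_net):
    """Top-down pose estimation: detect dancer boxes first (e.g. YOLOv3),
    then estimate joints independently inside each cropped box."""
    poses = []
    for x1, y1, x2, y2 in detector(frame):       # integer box corners
        crop = frame[:, :, y1:y2, x1:x2]         # (1, 3, h, w) person crop
        heatmaps = pose_net(crop)                # (1, K, H, W) per-joint maps
        _, k, h, w = heatmaps.shape
        flat = heatmaps.flatten(2).argmax(-1)    # (1, K) flat peak indices
        ys, xs = flat // w, flat % w             # heatmap coords of each joint
        poses.append(torch.stack([xs, ys], dim=-1))  # (1, K, 2) joint locations
    return poses
```

Fusing upsampled coarse features with fine ones in this way lets the network combine semantic context with spatial precision, which is the usual motivation for multi-scale fusion in pose estimation.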