Moving posture monitoring and guiding method and device based on multiple cameras

A motion-posture and multi-camera technology, applied in the field of image processing, addressing problems such as the inability of existing methods to accurately display movement postures in three-dimensional space, to support remote synchronized fitness with a professional coach, or to adapt beyond fixed, mechanical movements.

Pending Publication Date: 2020-11-03
Inventor: 董秀园
Cites: 0 | Cited by: 18

AI Technical Summary

Problems solved by technology

Most existing technologies detect motion from a single video, or from two-dimensional motion video captured by a single camera. Such methods cannot accurately represent a person's action posture in three-dimensional space, and therefore cannot provide scientific guidance. In addition, these video-based methods typically require comparison against a fixed standard action, so different groups of users cannot choose a preferred professional trainer for remote synchronized fitness, nor select a fitness program suited to their current condition. For example, beginners with little fitness experience, or athletes who are unwell, cannot adapt to fixed, mechanical movements.




Embodiment Construction

[0030] Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided to enable those skilled in the art to more thoroughly understand the present disclosure and to fully convey the scope of the present disclosure to those skilled in the art. Nothing in the following detailed description is intended to suggest that any particular component, feature, or step is essential to the invention. Those skilled in the art will understand that various features or steps can be substituted for each other or combined without departing from the scope of the present disclosure.

[0031] Figure 1 is a flow chart of a multi-camera-based motion posture monitoring...
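As a concrete illustration of the three-dimensional reconstruction step that such a flow would include, the sketch below triangulates a single body joint from its 2D detections in several calibrated cameras using the direct linear transform (DLT). The projection matrices, keypoint values, and function name are illustrative assumptions; the text available here does not disclose the patent's specific reconstruction algorithm.

```python
import numpy as np

def triangulate_joint(proj_mats, points_2d):
    """Triangulate one 3D joint from its 2D detections in several
    calibrated cameras via the direct linear transform (DLT).

    proj_mats : list of 3x4 camera projection matrices (assumed known
                from an offline calibration step).
    points_2d : list of (u, v) pixel coordinates of the same joint,
                one per camera, e.g. from a 2D pose estimator.
    Returns the joint position as a length-3 array in world coordinates.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each view gives two linear constraints on the homogeneous 3D point X:
        # u*(P[2] @ X) - (P[0] @ X) = 0 and v*(P[2] @ X) - (P[1] @ X) = 0
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # Least-squares solution: right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Toy usage with two hypothetical cameras.
P1 = np.hstack([np.eye(3), np.array([[0.0], [0.0], [5.0]])])
P2 = np.hstack([np.eye(3), np.array([[1.0], [0.0], [5.0]])])
X_true = np.array([0.2, -0.1, 1.0, 1.0])
uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate_joint([P1, P2], [uv1, uv2]))  # ~ [0.2, -0.1, 1.0]
```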



Abstract

The invention relates to a multi-camera-based moving posture monitoring and guiding method and device. The method comprises the following steps: acquiring moving images and/or video data of at least one target exerciser through a plurality of cameras; recognizing the target exerciser from the moving images and/or video data through a posture recognition algorithm, and outputting the human body posture graph required for moving posture monitoring; reconstructing the three-dimensional human body posture information of the target exerciser through a three-dimensional reconstruction algorithm based on the human body posture graph; carrying out skeleton registration between the target exerciser and a reference exerciser using the posture key nodes in three-dimensional space; comparing the three-dimensional human body posture information of the target exerciser after skeleton registration with that of the reference exerciser at a certain moment or within a preset time period; evaluating the action completion degree and quality of the target exerciser based on the comparison result; and providing feedback to the target exerciser based on the evaluation result, the feedback including whether the action reaches the standard and/or a motion optimization suggestion.
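To make the skeleton-registration and comparison steps of the abstract more tangible, the following minimal sketch registers a target skeleton against a reference skeleton (root translation plus a bone-length scale) and scores per-joint agreement. The joint names, the registration convention, the tolerance, and the scoring formula are all illustrative assumptions rather than the patent's own definitions.

```python
import numpy as np

# Hypothetical joint set; the patent text here does not fix a skeleton layout.
JOINTS = ["pelvis", "neck", "l_shoulder", "l_elbow", "l_wrist",
          "r_shoulder", "r_elbow", "r_wrist", "l_knee", "r_knee"]

def register_skeleton(pose, root="pelvis", ref_bone=("pelvis", "neck")):
    """Translate the pose so the root joint sits at the origin and scale it
    so a chosen reference bone has unit length, removing differences in
    position and body size before target and reference are compared."""
    pose = {j: np.asarray(p, dtype=float) for j, p in pose.items()}
    root_pos = pose[root]
    scale = np.linalg.norm(pose[ref_bone[1]] - pose[ref_bone[0]])
    return {j: (p - root_pos) / scale for j, p in pose.items()}

def compare_poses(target, reference, tolerance=0.15):
    """Per-joint distance between two registered skeletons, plus a simple
    completion score: the fraction of joints within the tolerance.
    The tolerance value is an illustrative choice, not from the patent."""
    errors = {j: float(np.linalg.norm(target[j] - reference[j]))
              for j in reference if j in target}
    score = sum(e <= tolerance for e in errors.values()) / len(errors)
    return errors, score

# Example: one frame of the target exerciser versus the reference exerciser.
reference_frame = {j: np.random.randn(3) for j in JOINTS}
target_frame = {j: p + 0.05 * np.random.randn(3)
                for j, p in reference_frame.items()}
errors, score = compare_poses(register_skeleton(target_frame),
                              register_skeleton(reference_frame))
print(f"completion score: {score:.2f}")
```

In the same spirit, the per-joint errors could drive the feedback step, e.g. flagging joints whose deviation exceeds the tolerance as candidates for a motion optimization suggestion.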

Description

Technical field

[0001] The present disclosure generally relates to the field of image processing, and in particular to a multi-camera-based motion posture monitoring and guidance method and device.

Background technique

[0002] In recent years, with the continuous improvement of national income levels and health awareness, more and more people participate in sports to improve their physical fitness. Traditional sports guidance relies on a coach observing the exerciser and then giving a targeted guidance program. At this stage, however, the level of sports coaches is uneven, professional coaches are in short supply, and their prices are high, which cannot meet the growing fitness, rehabilitation, and sports needs of lower-spending groups. Without hiring a professional coach, many sports beginners may get injured through incorrect exercise, may not know how to improve their training results for lack of guidance, or ca...


Application Information

IPC(8): G06K9/00; G06N3/04; G06N3/08; G06T17/00; G16H20/30
CPC: G06T17/00; G06N3/084; G16H20/30; G06V40/23; G06N3/045
Inventor: 董秀园
Owner: 董秀园