Mobile equipment human body pose estimation method based on three-dimensional skeleton extraction

A mobile-device, three-dimensional skeleton technology, applied in computing, computer components, image analysis, etc., which can solve the problems of a missing dimension, large calculation errors, and poor portability.

Active Publication Date: 2021-01-01
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0008] The technical problem to be solved by the present invention is to overcome the defects of the prior art and to provide a method for estimating the human body pose with a mobile device based on three-dimensional skeleton extraction, so as to address the prior-art problems of a missing dimension, large calculation errors, poor portability, and a large amount of computation.



Examples


Embodiment Construction

[0066] The following embodiments are only used to illustrate the technical solutions of the present invention more clearly, and cannot be used to limit the protection scope of the present invention.

[0067] The mobile-device human body pose estimation method based on three-dimensional skeleton extraction includes the following steps:

[0068] Input data collection: use a mobile device to collect real-time video of the person's body;

[0069] 2D joint point acquisition: pass the captured human body video to the background service and feed it into the lightweight human skeleton recognition model to obtain 2D human body joint points;
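A minimal sketch of this 2D joint-point acquisition step is given below. The patent does not name a specific "lightweight human skeleton recognition model"; MediaPipe Pose is used here purely as an illustrative stand-in for such a lightweight model, and the function name is hypothetical.

```python
# Sketch only: MediaPipe Pose stands in for the patent's unspecified
# lightweight human skeleton recognition model.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def extract_2d_joints(video_path):
    """Yield per-frame 2D joint points (x, y in pixels) from a human-body video."""
    cap = cv2.VideoCapture(video_path)
    with mp_pose.Pose(static_image_mode=False, model_complexity=0) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            h, w = frame.shape[:2]
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks is None:
                yield None  # no person detected in this frame
                continue
            # Convert normalized landmark coordinates to pixel coordinates.
            yield [(lm.x * w, lm.y * h) for lm in result.pose_landmarks.landmark]
    cap.release()
```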

[0070] 3D joint point estimation: map the obtained 2D human body joint points through a neural network regressor to obtain the corresponding 3D human body joint points in three-dimensional space;
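The patent does not disclose the exact architecture of the neural network regressor. The sketch below assumes a simple fully-connected residual lifting network (in the spirit of common 2D-to-3D lifting baselines) and a 17-joint layout; both are illustrative assumptions, not the patented design.

```python
# Sketch of a 2D-to-3D lifting regressor; architecture and joint count are assumed.
import torch
import torch.nn as nn

NUM_JOINTS = 17  # assumed joint count

class LiftingRegressor(nn.Module):
    def __init__(self, hidden=1024):
        super().__init__()
        self.inp = nn.Linear(NUM_JOINTS * 2, hidden)
        self.block = nn.Sequential(
            nn.Linear(hidden, hidden), nn.BatchNorm1d(hidden), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(hidden, hidden), nn.BatchNorm1d(hidden), nn.ReLU(), nn.Dropout(0.5),
        )
        self.out = nn.Linear(hidden, NUM_JOINTS * 3)

    def forward(self, joints_2d):
        # joints_2d: (batch, NUM_JOINTS, 2) normalized 2D joint coordinates
        x = self.inp(joints_2d.flatten(1))
        x = x + self.block(x)  # residual connection
        return self.out(x).view(-1, NUM_JOINTS, 3)  # (batch, NUM_JOINTS, 3)

# Usage example with random inputs:
# joints_3d = LiftingRegressor()(torch.randn(8, NUM_JOINTS, 2))
```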

[0071] Human skeleton acquisition: put the obtained 3D human body joint points back at the corresponding positions in the human body video frame, and connect...
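A minimal sketch of this skeleton-acquisition step follows: the estimated joints are drawn back onto the video frame at their pixel positions and connected along a bone list. The bone pairs below follow a common 17-joint convention and are an assumption, since the patent text is truncated before it lists the exact connections.

```python
# Sketch only: overlay joints on a frame and connect them along an assumed bone list.
import cv2

# (parent, child) index pairs of a typical 17-joint human skeleton (assumed)
BONES = [(0, 1), (1, 2), (2, 3), (0, 4), (4, 5), (5, 6), (0, 7), (7, 8),
         (8, 9), (9, 10), (8, 11), (11, 12), (12, 13), (8, 14), (14, 15), (15, 16)]

def draw_skeleton(frame, joints_px):
    """Overlay joints (list of (x, y) pixel coords) and their connections on a frame."""
    for x, y in joints_px:
        cv2.circle(frame, (int(x), int(y)), 3, (0, 255, 0), -1)
    for a, b in BONES:
        pa, pb = joints_px[a], joints_px[b]
        cv2.line(frame, (int(pa[0]), int(pa[1])),
                 (int(pb[0]), int(pb[1])), (255, 0, 0), 2)
    return frame
```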



Abstract

The invention discloses a mobile-device human body pose estimation method based on three-dimensional skeleton extraction. The method comprises the steps of: feeding a human body video into a lightweight human body skeleton recognition model to obtain 2D joint points; mapping the obtained 2D joint points through a regressor constructed from a neural network to obtain 3D joint points in space; calculating the inter-limb joint angles of the photographed person from the coordinate information of the obtained 3D joint points; and judging the pose of the photographed person and the shooting position of the photographing equipment according to those inter-limb joint angles and the position information of the 3D joint points. By mapping the 2D human body joint points produced by the lightweight human body skeleton recognition model to 3D human body joint points in three-dimensional space through a deep neural network regressor, the method increases the dimension of the geometric information carried by the 2D human body joint points. The limb included angles are calculated from the geometric relationships of the 3D human body joint points, and the pose of the human body and the shooting position of the mobile equipment are effectively predicted from the limb included angles and the positional relationships of the joint points.
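The inter-limb joint angle described above follows directly from the geometry of three 3D joint points. The sketch below shows one way to compute it; the (hip, knee, ankle) triple and the 150-degree "standing" threshold in the usage comment are illustrative assumptions, not values disclosed by the patent.

```python
# Sketch of computing an inter-limb joint angle from 3D joint coordinates.
import numpy as np

def joint_angle(p_a, p_b, p_c):
    """Angle (degrees) at joint b formed by segments b->a and b->c in 3D."""
    v1 = np.asarray(p_a, dtype=float) - np.asarray(p_b, dtype=float)
    v2 = np.asarray(p_c, dtype=float) - np.asarray(p_b, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# Example (assumed joint indices and threshold):
# angle = joint_angle(joints_3d[hip], joints_3d[knee], joints_3d[ankle])
# pose = "standing" if angle > 150 else "bent"
```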

Description

Technical field
[0001] The invention relates to a method for estimating the human body pose with a mobile device based on three-dimensional skeleton extraction, and belongs to the technical field of video image processing.
Background technique
[0002] Thanks to the development of cameras and AI-assisted processors in the mobile phone industry, artificial intelligence has become increasingly integrated into people's daily lives. The human pose estimation methods currently used on mobile platforms are all based on the extraction of 2D joint points. Because a dimension is missing, predictions of the human pose often contain appreciable errors or are even outright wrong.
[0003] Deep convolutional neural networks have raised the performance of computer vision tasks to new heights. The development trend of deep convolutional neural networks is toward higher recognition accuracy. Therefore, as recognition accuracy improves, the structure of the deep convolutional neural network becomes deeper and m...

Claims


Application Information

IPC(8): G06T7/20, G06T7/40, G06T7/50, G06T7/73, G06T3/00, G06K9/62, G06K9/00
CPC: G06T7/75, G06T7/50, G06T7/40, G06T7/20, G06V40/23, G06F18/214, G06T3/06
Inventor: 高浩, 李奕, 徐枫, 宗睿, 余新光, 潘隆盛, 凌至培
Owner NANJING UNIV OF POSTS & TELECOMM