
Human pose estimation based on directional image fusion

A pose-estimation and direction-map technology in the fields of image processing and computer vision. It addresses problems such as keypoints that are hard to identify, slow training, and pose estimation under occlusion, and achieves easier training convergence, better robustness, and improved prediction precision.

Inactive Publication Date: 2018-12-18
SOUTHEAST UNIV


Problems solved by technology

Existing methods have some robustness, but they struggle to detect keypoints that are occluded or that blend into the background color.

[0004] Human pose estimation must overcome several technical problems: (1) Occlusion is the hardest problem in pose estimation. Existing methods mostly detect the position heat map directly and let a deep convolutional neural network implicitly learn the relations between key body parts, which makes occluded parts hard to detect and also slows training. (2) Foreground/background confusion: the color around some keypoints is nearly identical to the adjacent background region, so the keypoint position cannot be estimated accurately and the detection error is large. (3) Symmetric keypoints are hard to tell apart; for example, the left and right feet are indistinguishable when the body is viewed from the side.
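The left/right ambiguity in problem (3) is exactly what directional information can resolve: even when the position heat maps of two symmetric joints look alike, the direction along the limb differs in sign. The sketch below is one plausible way to build such a 2-channel direction map (in the style of part-affinity fields); the patent does not spell out its exact construction, so the radius, masking, and encoding here are illustrative assumptions.

```python
import numpy as np

def limb_direction_map(p_from, p_to, shape, radius=2.0):
    """Sketch: 2-channel map holding the unit vector along the limb
    (p_from -> p_to), written into pixels within `radius` of the segment.
    Points are (x, y). This is one plausible direction-map construction,
    not necessarily the exact one used in the patent."""
    h, w = shape
    v = np.asarray(p_to, float) - np.asarray(p_from, float)
    length = np.linalg.norm(v)
    u = v / (length + 1e-8)                        # unit vector along the limb
    ys, xs = np.mgrid[0:h, 0:w]
    d = np.stack([xs - p_from[0], ys - p_from[1]], axis=-1).astype(float)
    proj = d @ u                                   # distance along the limb
    perp = np.abs(d @ np.array([-u[1], u[0]]))     # distance off the limb
    mask = (proj >= 0) & (proj <= length) & (perp <= radius)
    dirmap = np.zeros((2, h, w))
    dirmap[0][mask] = u[0]
    dirmap[1][mask] = u[1]
    return dirmap
```

With this encoding, a left leg pointing down-right and a right leg pointing down-left produce direction channels of opposite sign, so a network that sees the direction map can separate keypoints whose position heat maps alone would be ambiguous.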




Detailed Description of the Embodiments

[0020] The technical solutions provided by the present invention will be described in detail below in conjunction with specific examples. It should be understood that the following specific embodiments are only used to illustrate the present invention and are not intended to limit the scope of the present invention.

[0021] The method provided by the present invention for estimating human pose by fusing direction maps uses a deep convolutional network for end-to-end learning. The network structure is shown in Figure 1. First, a simple convolutional network preliminarily processes the input RGB image to obtain basic features. Higher-level features are then obtained through an hourglass network module. Next, the position heat map and the direction heat map are obtained through the position network and the direction network, respectively. The fusion network in the later stage fuses the position heat map, direction map and previous h...
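The fusion stage described above can be sketched in numpy: the high-level features, position heat maps, and direction maps are concatenated along the channel axis and passed through a 1x1 convolution (implemented here as a per-pixel linear map via `tensordot`) to produce refined position heat maps. The channel counts, number of keypoints, and map resolution below are illustrative assumptions, not figures taken from the patent.

```python
import numpy as np

def fuse(features, pos_heatmaps, dir_maps, weights):
    """Sketch of the fusion network's first step: channel-wise
    concatenation of high-level features, position heat maps, and
    direction maps, followed by a 1x1 convolution.
    All shapes are illustrative; the patent's layer sizes are not given here."""
    x = np.concatenate([features, pos_heatmaps, dir_maps], axis=0)  # (C+3K, H, W)
    # A 1x1 conv is a linear map over channels applied at every pixel.
    return np.tensordot(weights, x, axes=([1], [0]))                # (K, H, W)

# Illustrative shapes: C=32 feature channels, K=16 keypoints, 64x64 maps.
rng = np.random.default_rng(0)
C, K, H, W = 32, 16, 64, 64
feats = rng.standard_normal((C, H, W))
pos   = rng.standard_normal((K, H, W))        # position heat maps
dirs  = rng.standard_normal((2 * K, H, W))    # 2 direction channels per keypoint
w     = rng.standard_normal((K, C + 3 * K)) * 0.01
refined = fuse(feats, pos, dirs, w)
print(refined.shape)  # (16, 64, 64)
```

In the real network the 1x1 projection would be followed by further convolutional layers and a loss against the ground-truth heat maps; the point here is only the channel-wise fusion of the three information sources.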



Abstract

The invention discloses a human pose estimation method based on directional image fusion, comprising the following steps: extracting features to obtain high-level features; performing preliminary detection of the keypoint location map and of the directional image to obtain a position heat map and a directional heat map; and fusing the predicted position heat map, direction map, and high-level features to obtain a more accurate position heat map. By fusing the direction information, the invention improves the prediction precision of the keypoints, and compared with previous network structures, the proposed deep convolutional network can capture more human-posture information. The invention can effectively handle pose estimation under occlusion: for occluded keypoints, as long as the position heat map is basically correct, even if the first-stage position network cannot locate the keypoint well, the later fusion network can still predict the keypoint position accurately, giving better robustness. The network structure has far fewer parameters and a modest GPU-memory requirement during training, so training converges more easily.
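The position heat maps the abstract refers to are conventionally built as 2D Gaussians peaked at each ground-truth keypoint, and the predicted keypoint is read off as the argmax of the map. The sketch below shows that standard construction; the exact sigma and normalization used by the patent are assumptions.

```python
import numpy as np

def gaussian_heatmap(center, shape, sigma=2.0):
    """Standard 2D Gaussian heat map peaked at `center` = (x, y).
    This is the conventional target for keypoint detection; the
    patent's exact sigma and normalization are not specified here."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cx, cy = center
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))

hm = gaussian_heatmap((20, 30), (64, 64))
peak = np.unravel_index(np.argmax(hm), hm.shape)
print(peak)  # (30, 20): row index = y, column index = x
```

Because the target is a smooth bump rather than a single hot pixel, the network receives gradient signal in a neighborhood of the keypoint, which is part of why heat-map regression trains more stably than direct coordinate regression.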

Description

Technical field

[0001] The invention belongs to the technical fields of computer vision and image processing, and relates to human body pose estimation, in particular to a human pose estimation method that fuses direction maps.

Background

[0002] With the popularity of smartphones, tablet computers, and other devices, images and video of human activity are generated constantly. Enabling computers to understand human body movements automatically is therefore important, with wide application in fields such as human-computer interaction and intelligent monitoring. Computers that can efficiently and automatically understand human movement will have a profound impact on society. In this context, human pose estimation was proposed. Its purpose is to detect the pose of the human body and obtain digital, presentable pose information through computer learning an...

Claims


Application Information

IPC(8): G06K9/00, G06T7/207, G06N3/04
CPC: G06T7/207, G06T2207/10024, G06T2207/20221, G06V40/20, G06N3/045
Inventors: 庄文林, 王雁刚, 夏思宇
Owner: SOUTHEAST UNIV