
High-accuracy human body multi-position identification method based on convolutional neural network

A convolutional neural network recognition technology, applied in the field of convolutional-neural-network-based multi-part human body recognition, achieving the effects of accurate positioning and improved recognition accuracy

Status: Inactive
Publication Date: 2016-07-06
BEIJING UNIV OF TECH
Cites: 2 | Cited by: 34

AI Technical Summary

Problems solved by technology

The key problem is how to build an association model between the whole body and each part that maintains high accuracy as pose and viewing angle change. Among earlier methods, geometric-constraint algorithms such as Poselets and DPM achieved good results; however, most of them locate each part from a given overall bounding box, so these methods still need further improvement.



Detailed Description of the Embodiments

[0021] To make the purpose, technical solution, and advantages of the present invention clearer, the invention is described in further detail below in conjunction with the accompanying drawings and formulas.

[0022] Figure 2 is a structural diagram of the convolutional neural network in the overall human body multi-part recognition model of the present invention. It adopts the deep convolutional neural network structure of Krizhevsky et al. The input is a 224×224 RGB image, which passes through five convolutional layers, three pooling layers, and three fully connected layers to finally produce the classification result for each candidate region. The activation function is changed from the earlier f(x) = tanh(x) or f(x) = (1 + e^(-x))^(-1) to f(x) = max(0, x), which greatly improves the computation speed of the convolutional neural network. Local response normalization uses b^i_{x,y} ...
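To make the layer arrangement described above concrete, below is a minimal PyTorch sketch of an AlexNet-style network, not the authors' original implementation: a 224×224 RGB input, five convolutional layers, three pooling layers, three fully connected layers, ReLU activations, and local response normalization. The kernel sizes, channel counts, LRN constants, and the class count are assumptions following Krizhevsky et al., not values given in this summary.

```python
# Minimal AlexNet-style CNN sketch (PyTorch) approximating the structure
# described above: 224x224 RGB input, 5 conv layers, 3 pooling layers,
# 3 fully connected layers, ReLU activations, local response normalization.
# Layer hyperparameters follow Krizhevsky et al. (2012) and are assumptions,
# not taken from this patent summary.
import torch
import torch.nn as nn

class AlexNetStyle(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 96, kernel_size=11, stride=4, padding=2),   # conv1
            nn.ReLU(inplace=True),                                   # f(x) = max(0, x)
            nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2),
            nn.MaxPool2d(kernel_size=3, stride=2),                   # pool1
            nn.Conv2d(96, 256, kernel_size=5, padding=2),            # conv2
            nn.ReLU(inplace=True),
            nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2),
            nn.MaxPool2d(kernel_size=3, stride=2),                   # pool2
            nn.Conv2d(256, 384, kernel_size=3, padding=1),           # conv3
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 384, kernel_size=3, padding=1),           # conv4
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3, padding=1),           # conv5
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),                   # pool3
        )
        self.classifier = nn.Sequential(
            nn.Linear(256 * 6 * 6, 4096),                            # fc6
            nn.ReLU(inplace=True),
            nn.Linear(4096, 4096),                                   # fc7
            nn.ReLU(inplace=True),
            nn.Linear(4096, num_classes),                            # fc8 -> class scores
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)             # (N, 256, 6, 6) for a 224x224 input
        x = torch.flatten(x, 1)
        return self.classifier(x)

# Example: score a batch of candidate regions warped to 224x224
# (num_classes=7 is purely illustrative).
scores = AlexNetStyle(num_classes=7)(torch.randn(2, 3, 224, 224))
```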



Abstract

The invention provides a high-accuracy human body multi-position identification method based on a convolutional neural network. A deep convolutional neural network is adopted to extract image features, making full use of the deep information in an image and greatly improving recognition accuracy. Secondly, on the basis of the RCNN (Region Convolutional Neural Network) method, a Selective Search algorithm is used to generate candidate borders, which adapts better than a sliding-window method to the positioning accuracy achievable with a deep convolutional neural network. Furthermore, the final Softmax layer of the convolutional neural network is replaced with an SVM (Support Vector Machine) to obtain classification scores. In addition, after the SVM score of each candidate border for each category is obtained, a pixel-based position-range constraint, a K-nearest-neighbor constraint, and a Gaussian mixture model are added to form the final combination of candidate borders based on an understanding of the whole body. This improves the accuracy of human body multi-position identification and allows more accurate positioning than the original RCNN method.
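The pipeline in the abstract follows the R-CNN pattern: Selective Search proposals are warped, passed through the CNN for deep features, and scored by one linear SVM per body-part class in place of the Softmax layer, with the position-range, K-nearest-neighbor, and Gaussian-mixture constraints applied on top of those scores. Below is a hedged sketch of the proposal-scoring stage only; `cnn` is assumed to be the AlexNetStyle sketch shown earlier, the part names are illustrative, and the constraint step is only named, not implemented.

```python
# Hedged sketch of the R-CNN-style scoring stage described in the abstract:
# Selective Search proposals, already warped to 224x224, are passed through
# the CNN and scored by per-class linear SVMs (replacing Softmax).
# `cnn` is assumed to be the AlexNetStyle sketch shown earlier; the
# pixel-based position-range constraint, K-nearest-neighbor constraint, and
# Gaussian mixture model are only named here, not implemented.
import numpy as np
import torch
from sklearn.svm import LinearSVC

def extract_features(cnn, crops: torch.Tensor) -> np.ndarray:
    """Return the penultimate fully connected activations ("fc7") for a
    batch of warped 224x224 region crops."""
    cnn.eval()
    with torch.no_grad():
        feats = torch.flatten(cnn.features(crops), 1)
        for layer in list(cnn.classifier)[:-1]:   # stop before the last FC layer
            feats = layer(feats)
    return feats.numpy()

def score_candidates(cnn, crops: torch.Tensor, part_svms: dict) -> dict:
    """Score every candidate region against every body-part SVM.

    `crops` holds the Selective Search proposals warped to 224x224;
    `part_svms` maps a part name (e.g. "head") to a fitted LinearSVC.
    """
    feats = extract_features(cnn, crops)
    return {part: svm.decision_function(feats) for part, svm in part_svms.items()}

# The final step (not shown) would combine these per-part scores with the
# position-range, K-nearest-neighbor, and Gaussian-mixture constraints to
# select one border per part.
```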

Description

Technical field
[0001] The invention relates to the field of pattern recognition and deep learning, in particular to a high-accuracy multi-part human body recognition method based on a convolutional neural network.
Background technique
[0002] Human body recognition is a hot topic in computer vision. In the past, human body recognition mostly relied on low-level feature factors and high-level context. The SIFT operator and the HOG operator are commonly used; both are built from low-level orientation histograms, whereas the features of an image are layered and become more abstract layer by layer. Therefore, building on the research of Rumelhart et al., LeCun et al. proposed a training method for the convolutional neural network (CNN) based on the stochastic gradient descent algorithm with backpropagation, which received considerable attention at the time and formed a new understanding in the field of computer vision.
[0003] Although CNN was widely used in the 1990s, it was late...
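As a point of comparison for the hand-crafted orientation-histogram features mentioned above, here is a minimal sketch of extracting a HOG descriptor with scikit-image; the parameter values are conventional defaults, not taken from this patent.

```python
# Minimal HOG feature extraction sketch (scikit-image), illustrating the
# orientation-histogram features mentioned above. Parameter values are
# conventional defaults, not taken from this patent.
import numpy as np
from skimage.feature import hog

image = np.random.rand(128, 64)   # grayscale crop, e.g. a pedestrian window
descriptor = hog(
    image,
    orientations=9,               # 9 gradient-orientation bins per cell
    pixels_per_cell=(8, 8),
    cells_per_block=(2, 2),
    block_norm="L2-Hys",
)
print(descriptor.shape)           # (3780,) for a 128x64 window with these settings
```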


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62
CPC: G06F18/2411; G06F18/214
Inventor: 刘波, 张恒瑜
Owner: BEIJING UNIV OF TECH