
Human body gesture identification method based on depth convolution neural network

A neural-network-based human body posture technology, applied in the field of human pose estimation systems. It addresses the limited accuracy of pose estimation caused by hand-crafted image features and spatial models, saving space and time overhead, avoiding those limitations, and achieving high accuracy.

Inactive Publication Date: 2015-11-18
UNIV OF ELECTRONICS SCI & TECH OF CHINA
Cites: 5 · Cited by: 87

AI Technical Summary

Problems solved by technology

[0013] To overcome the insufficient pose estimation accuracy caused by hand-crafted image features and spatial models in traditional pose estimation methods, the present invention studies how to obtain higher pose accuracy while preserving pose estimation speed.


Examples


Embodiment Construction

[0033] To solve the above problems, the specific implementation steps of the human pose estimation system based on a deep convolutional neural network proposed by the present invention are as follows:

[0034] Step 1: Preprocessing. Data augmentation plays a crucial role in training deep convolutional neural networks. In the model training stage, the present invention targets the pose estimation problem. The data augmentation method adopted is: the training samples are enhanced through rotation, translation, scale transformation, etc., forcing the model to learn features that are robust to rotation, translation, and scale changes. At the same time, these operations provide a large number of synthetic samples for model training. In the model running stage, it is only necessary to scale the input image to fit the input layer size of the deep convolutional neural network, and record the correspondence between the pix...
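One practical detail of the augmentation step above is that the joint labels must undergo the same geometric transform as the image, or the supervision becomes misaligned. A minimal sketch of transforming a single joint coordinate under rotation, scaling, and translation is shown below; the function name and parameters are illustrative, not taken from the patent.

```python
import math

def transform_joint(x, y, cx, cy, angle_deg, scale, tx, ty):
    """Apply rotation (about the point (cx, cy)), uniform scaling, and
    translation to one joint coordinate, mirroring the transform applied
    to the image so image and label stay aligned."""
    a = math.radians(angle_deg)
    # shift so the rotation center is the origin
    dx, dy = x - cx, y - cy
    # rotate, then scale
    rx = scale * (dx * math.cos(a) - dy * math.sin(a))
    ry = scale * (dx * math.sin(a) + dy * math.cos(a))
    # shift back and apply the translation offset
    return rx + cx + tx, ry + cy + ty
```

For example, an identity transform (zero rotation, unit scale, zero offset) leaves a joint unchanged, while a 90-degree rotation about the origin maps (1, 0) to approximately (0, 1).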



Abstract

The invention discloses a human body gesture identification method based on a deep convolutional neural network. It belongs to the technical field of pattern recognition and information processing, relates to behavior recognition tasks in computer vision, and in particular relates to the research and implementation of a human pose estimation system based on a deep convolutional neural network. The network comprises independent output layers and independent loss functions designed for locating human body joints. ILPN consists of an input layer, seven hidden layers, and two independent output layers. The first through sixth hidden layers are convolutional layers used for feature extraction; the seventh hidden layer (fc7) is a fully connected layer. The output layer consists of two independent parts, fc8-x and fc8-y: fc8-x predicts the x coordinate of a joint, and fc8-y predicts its y coordinate. During model training, each output has an independent softmax loss function to guide the learning of the model. The method has the advantages of simple and fast training, small computation cost, and high accuracy.
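The abstract's use of a softmax loss per output suggests each coordinate is treated as a classification over discrete positions. A minimal sketch of decoding a joint location from the two independent heads is below; treating each axis as a softmax over bins follows the abstract, while the argmax decoding rule and function names are assumptions for illustration.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of raw scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def decode_joint(fc8_x_scores, fc8_y_scores):
    """Decode one joint's (x, y) from the two independent output heads
    (fc8-x and fc8-y) by taking the most probable bin along each axis."""
    px = softmax(fc8_x_scores)
    py = softmax(fc8_y_scores)
    x = max(range(len(px)), key=px.__getitem__)
    y = max(range(len(py)), key=py.__getitem__)
    return x, y
```

Because the two heads are independent, each coordinate's loss and decoding involve only a 1-D distribution over positions rather than a joint 2-D map, which is consistent with the stated advantage of small computation cost.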

Description

technical field [0001] The invention belongs to the technical field of pattern recognition and information processing, and relates to the behavior recognition task of computer vision, in particular to the research and realization of a human pose estimation system based on a deep convolutional neural network. Background technique [0002] Human pose estimation refers to the process of locating the positions of human joints or body parts in images. It is a key problem in computer vision and a fundamental technique for image-based action recognition. Human pose estimation can be used in visual surveillance systems, human body segmentation, robot control, somatosensory games, and other fields. The difficulty of human pose estimation lies in the following: body joints are small and hard to detect; the body often occludes itself; appearance varies greatly; and in-plane and out-of-plane rotations cause large visual changes. The mainstream methods of po...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00
CPC: G06V40/10
Inventor: 董乐 (Dong Le), 张宁 (Zhang Ning)
Owner UNIV OF ELECTRONICS SCI & TECH OF CHINA