
Human body skeleton-based action recognition method

A human-skeleton-based action recognition technology, applied in the field of pattern recognition and human-computer interaction, which addresses the problems of prior methods: inability to measure the similarity between actions, unreasonable matching information (feature values), and increased implementation complexity and difficulty.

Active Publication Date: 2016-09-07
NANJING HUAJIE IMI TECH CO LTD
Cited by: 75

AI Technical Summary

Problems solved by technology

This method also has the following obvious deficiencies. First, it must normalize the 3D human skeleton model: the body height is normalized to 1, the body position is adjusted so that its distance from the camera matches the distance set in the action library, and the position of each joint point and the length of each limb are then adjusted accordingly. On the one hand, this normalization is computationally expensive, because every frame of skeleton data and all relevant joint points must be processed; on the other hand, adjusting joint positions and limb lengths according to the distance from the camera is of questionable scientific soundness and accuracy. Second, the matching information (feature values) used by the method is unreasonable: the lengths of joint connections and their 2D projections are used as matching metrics, yet such values cannot serve as effective action features. Third, the matching algorithm is too simple: it computes the differences between measurement values over a fixed number of frames and uses them as the matching degree for action recognition, so it cannot measure the similarity of actions performed at different speeds and therefore has no practical application value.
Although this method can recognize some actions, it also has obvious deficiencies. First, it performs action recognition on depth maps, so its accuracy depends largely on the quality of the depth map and is affected by the external environment. Second, it requires the support of complex pattern recognition algorithms, and its training set requires a large amount of offline training, which makes it difficult to implement. Third, its real-time performance is poor, and the recognition results are subject to large delays.
This method has the following deficiencies. First, the 3D coordinates of the human skeleton it uses are identified and extracted by a per-pixel object recognition method based on a random decision forest classifier; that is, the method must first establish a classifier learning algorithm that can obtain skeleton data from depth images. This undoubtedly increases the complexity and difficulty of implementation, the quality of the identified skeleton coordinate data introduces uncertainty into the recognition accuracy of the entire method, and it also degrades the method's real-time performance. Second, the method uses only the coordinate data of the skeleton, so its action features are limited. Third, the method must ultimately build a human behavior recognition model trained on samples of specific actions, which limits its applicability and scalability.



Embodiment Construction

[0053] Specific embodiments of the present invention are described in further detail below with reference to the drawings and examples.

[0054] With reference to Figure 1, the action recognition method based on a human skeleton proposed by the present invention comprises the following concrete steps:

[0055] Step 1, acquiring the continuous skeleton data frame sequence of the person performing the target action from the somatosensory device: the somatosensory device refers to a collection device that can obtain at least the 3D spatial position information and angle information of each joint point of the human skeleton; the human skeleton data comprise the joint point data provided by this acquisition device;
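The patent does not fix a data layout for these frames; as a minimal sketch (all names here are hypothetical, and a real somatosensory SDK would deliver frames through its own callback API), a frame can be modeled as a timestamped map from joint names to 3D positions:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

# Hypothetical layout: the patent only requires that the device report
# 3D position (and angle) information for each skeletal joint per frame.
Joint = Tuple[float, float, float]  # (x, y, z) in device coordinates


@dataclass
class SkeletonFrame:
    timestamp_ms: int
    joints: Dict[str, Joint]


def collect_sequence(device_frames) -> List[SkeletonFrame]:
    """Accumulate the continuous frame sequence while the target action runs.

    `device_frames` is any iterable yielding (timestamp, joints) pairs.
    """
    return [SkeletonFrame(t, j) for t, j in device_frames]
```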

[0056] Step 2, screening out from the skeleton data the main joint point data that can represent the action: the main joint point data characterizing an action are the joint point data that play a key role in recognizing it;
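The patent does not list which joints count as "main" for a given action; the sketch below simply filters a frame's joint map down to an illustrative subset (the joint names and the chosen subset are assumptions, not taken from the patent):

```python
# Hypothetical subset of joints assumed to carry most of the action
# information; a real system would choose these per action class.
KEY_JOINTS = ("head", "left_hand", "right_hand",
              "left_elbow", "right_elbow",
              "left_knee", "right_knee")


def select_key_joints(frame):
    """Keep only the joints that play a key role in action recognition.

    `frame` maps joint names to (x, y, z) positions; joints absent from
    the frame are silently skipped.
    """
    return {name: frame[name] for name in KEY_JOINTS if name in frame}
```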

[0057] Ste...


Abstract

The invention relates to a human body skeleton-based action recognition method, comprising the following basic steps: step 1, a continuous skeleton data frame sequence of a person performing the target action is obtained from a somatosensory device; step 2, the main joint point data that characterize the action are screened out from the skeleton data; step 3, action feature values are extracted from the main joint point data and a feature vector sequence of the action is constructed; step 4, the feature vectors are preprocessed; step 5, the feature vector sequences of an action sample set are saved as an action sample template library; step 6, actions are acquired in real time, and the distances between their feature vector sequences and the feature vector sequences of all action samples in the template library are computed with a dynamic time warping algorithm; and step 7, the actions are classified and recognized. The method has high real-time performance, robustness, and accuracy, is simple and reliable to implement, and is suitable for real-time action recognition systems.
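Step 6 relies on the standard dynamic time warping (DTW) algorithm, which aligns two sequences of different lengths so that actions performed at different speeds can still be compared — exactly the failure mode the patent criticizes in fixed-frame matching. A minimal sketch over generic feature vectors (the concrete feature values and any preprocessing used by the patent are not reproduced here):

```python
import math


def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two feature-vector sequences.

    Each sequence is a list of equal-length feature vectors; Euclidean
    distance is used as the local cost between two vectors.
    """
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    # dp[i][j] = minimal cost of aligning seq_a[:i] with seq_b[:j]
    dp = [[inf] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(seq_a[i - 1], seq_b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # stretch seq_b
                                  dp[i][j - 1],      # stretch seq_a
                                  dp[i - 1][j - 1])  # advance both
    return dp[n][m]
```

Classification (step 7) would then pick the template with the smallest DTW distance to the live sequence, e.g. a nearest-neighbour lookup over the template library.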

Description

technical field

[0001] The invention belongs to the technical field of pattern recognition and human-computer interaction, and in particular relates to an action recognition method based on a human skeleton.

Background technique

[0002] With the development of computer vision and human-computer interaction technology, more and more human-computer interaction systems use human gestures or actions as input, and controlling systems with actions is becoming increasingly common. However, due to differences between human bodies, the diversity of action execution, and the complexity of actions themselves, recognizing human actions in a real-time, stable, and accurate manner remains a very difficult task.

[0003] Chinese patent application CN201110046975.4 discloses "a method for realizing real-scene games based on action decomposition and behavior analysis". The action recognition involved in this method is to combine the obtained normalized human...


Application Information

IPC(8): G06K9/00
CPC: G06V40/23
Inventors: 王行, 周晓军, 李骊, 盛赞
Owner: NANJING HUAJIE IMI TECH CO LTD