Human skeleton joint point behavior motion expression method based on energy function

A technology combining the human skeleton and an energy function, applied in the fields of instruments, character and pattern recognition, computer components, etc. It addresses the problems that existing expression methods are not intuitive enough, are difficult to explain intuitively, and can misjudge motion as a static state, so as to improve accuracy and reliability.

Status: Inactive
Publication Date: 2016-08-31
Applicant: UNIV OF SHANGHAI FOR SCI & TECH

AI Technical Summary

Problems solved by technology

At present, behavior representation methods based on the human skeleton mainly use features such as the skeleton structure with joint angles, the trajectory of a single joint point, and velocity to describe the position, motion and trajectory of human actions. However, these representations are difficult to explain intuitively, and because different people perform the same action differently, the resulting joint-point trajectories are not the same.
[0003] A search of the existing literature shows that Ofli et al., in "Sequence of the most informative joints (SMIJ): A new representation for human skeletal action recognition" (J. Vis. Commun. Image R.), proposed that when the human body performs the same action, the number and order of the most informative joints are consistent, and used entropy to quantitatively describe the information content of human motion. However, this motion representation is defective for certain specific motions: when the human body is rotating, for example, the joint angles change little, so the entropy changes little and the motion is misjudged as a stationary state.
Moreover, from the perspective of human perception, such a representation is not intuitive enough and is not easy to use directly for dividing video sequences.




Embodiment Construction

[0025] The method first obtains the position information of the joint points of the human skeleton through video equipment, then calculates the kinetic energy and potential energy of each skeleton joint point together with the human-object interaction potential energy, and uses these quantities to represent a person's action characteristics quantitatively. Finally, this representation is applied to dividing long videos into sub-actions, yielding sub-action video sequences that each carry a complete action meaning.
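The patent text shown here does not give explicit formulas for these energies. As a rough, hedged sketch only, one can treat each joint as a unit point mass, estimate its velocity by finite differences across frames, and measure potential energy as height above a reference joint such as the hip centre. The array shape, frame rate, unit masses and hip reference in the code below are illustrative assumptions, not the patent's definitions.

```python
# Minimal sketch of per-joint kinetic and potential energy features.
# Assumptions (not specified in the patent text): joints arrive as an
# (F, J, 3) array of 3D positions in metres, all joint masses are 1,
# velocity is estimated by finite differences at a known frame rate,
# and potential energy is the height above the hip-centre joint.
import numpy as np

GRAVITY = 9.81  # m/s^2

def joint_energies(positions, fps=30.0, hip_index=0):
    """Return per-frame, per-joint kinetic and potential energy.

    positions : ndarray of shape (F, J, 3), skeleton joint coordinates.
    fps       : capture frame rate used for the velocity estimate.
    hip_index : joint used as the zero reference for potential energy.
    """
    positions = np.asarray(positions, dtype=float)
    # Velocity via forward differences; repeat the last value to keep F frames.
    velocity = np.diff(positions, axis=0) * fps
    velocity = np.concatenate([velocity, velocity[-1:]], axis=0)

    # Kinetic energy 0.5 * m * |v|^2 with unit masses (assumption).
    kinetic = 0.5 * np.sum(velocity ** 2, axis=2)

    # Potential energy m * g * h, with height taken relative to the hip
    # joint so the feature does not depend on where the person stands.
    height = positions[:, :, 1] - positions[:, hip_index:hip_index + 1, 1]
    potential = GRAVITY * height

    return kinetic, potential  # each of shape (F, J)
```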

[0026] In this embodiment, the above method is first tested on the simple actions of the Microsoft Research Cambridge-12 (MSRC-12) action data set, and then a test experiment is conducted on the complex actions of the Cornell Activity Dataset-120 (CAD-120) from Cornell University, verifying the effectiveness of the segmentation method for complex action recognition.

[0027] This embodiment includes the following steps:

[0028] The first ...



Abstract

The invention relates to a human skeleton joint point behavior motion expression method based on an energy function. First, the position information of the human skeleton joint points is acquired with video equipment. Second, the kinetic energy and potential energy of each joint point and the human-object interaction potential energy are calculated, so that human motion characteristics are expressed quantitatively. Human behavior video sequences are then divided by merging adjacent frames: the energy values of two adjacent frames are computed from the motion characteristics and compared, and when the similarity measure is smaller than the similarity threshold the two frames belong to the same division group, otherwise they belong to different motion segments. In this way, multiple sub-action video sequences, each with a complete motion meaning, are obtained. The method greatly improves the accuracy and reliability of video division and can further be applied to human motion recognition, key frame extraction and other tasks.
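The abstract describes adjacent-frame merging driven by an energy similarity threshold but does not fix the similarity measure. The sketch below assumes each frame is summarised by a vector of its joint energies (e.g. the output of the joint_energies sketch above) and interprets "similarity smaller than the threshold" as a small Euclidean distance between neighbouring frames' energy vectors; the function name and threshold value are illustrative only.

```python
# Minimal sketch of segmentation by adjacent-frame merging.
# Assumption: a small distance between the energy vectors of two
# neighbouring frames means they belong to the same sub-action segment.
import numpy as np

def segment_by_energy(frame_energy, threshold):
    """Group consecutive frames into sub-action segments.

    frame_energy : ndarray (F, D), one energy feature vector per frame.
    threshold    : merge threshold on the adjacent-frame distance.
    Returns a list of (start, end) index pairs, end exclusive.
    """
    frame_energy = np.asarray(frame_energy, dtype=float)
    segments, start = [], 0
    for t in range(1, len(frame_energy)):
        distance = np.linalg.norm(frame_energy[t] - frame_energy[t - 1])
        if distance >= threshold:      # energy changes sharply: new segment
            segments.append((start, t))
            start = t
    segments.append((start, len(frame_energy)))
    return segments

# Usage (hypothetical): stack kinetic and potential features per frame.
# kinetic, potential = joint_energies(positions)
# features = np.concatenate([kinetic, potential], axis=1)
# print(segment_by_energy(features, threshold=5.0))
```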

Description

Technical field

[0001] The invention relates to an image information processing technology, in particular to an energy-function-based behavior expression method for human skeleton joint points.

Background technique

[0002] With the widespread use of video equipment and 3D cameras, the recognition of human behavior has become an important research topic in the field of computer vision. Appropriately representing human behavior is a tricky and important task; it has special significance for surveillance, intelligent robots, human-computer interaction, etc., and is the basis and key step for accurately recognizing and understanding human behavior. At present, behavior representation methods based on the human skeleton mainly use features such as the skeleton structure with joint angles, the trajectory of a single joint point, and velocity to describe the position, motion and trajectory of human actions, but these representation methods are difficult to intuiti...


Application Information

IPC(8): G06K9/00
CPC: G06V40/23; G06V20/42
Inventors: 王永雄, 曾艳, 魏国亮, 宋燕, 李璇, 刘嘉莹
Owner: UNIV OF SHANGHAI FOR SCI & TECH