
Human action recognition method

A human action recognition technology applied in the fields of computer vision and pattern recognition, which addresses the problems of complex and time-consuming computation in existing approaches and achieves the effects of improving the recognition rate and the ability to resist noise.

Publication status: Inactive; publication date: 2017-05-10
SHANGHAI NORMAL UNIVERSITY

AI Technical Summary

Problems solved by technology

Accurate estimation of human motion generally requires a human body model with about 60 joint parameters, but computing the optimal parameter estimate in a parameter space of more than 60 dimensions is very complicated and time-consuming.



Examples


Embodiment

[0047] As shown in Figure 1, this embodiment is a human action recognition method comprising the following steps.

[0048] Step 1: Acquire the video containing the human action recognition feature data to be processed;

[0049] Specifically, the MSR Action3D database can be used as the experimental data. MSR Action3D is a public dataset that provides depth maps and skeleton sequences captured with an RGBD camera. It contains 20 actions performed by 10 subjects facing the camera, with each action performed two or three times; the depth maps are 320×240 pixels. To analyze the recognition results more clearly, 18 of the 20 action classes are selected and divided into three groups of experiments. The 18 actions are labeled Action1 through Action18.
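The following is a minimal Python sketch of how a skeleton sequence from the dataset might be loaded for the experiments described above. The file layout assumed here (one text file per sequence, one line per joint with x, y, z and a confidence value, 20 joints per frame) matches the commonly distributed MSR Action3D skeleton files, and the file name in the usage comment is only an illustration of the dataset's a{action}_s{subject}_e{trial} naming convention; verify both against your own copy of the data.

import numpy as np

def load_msr_skeleton(path, joints_per_frame=20):
    # Load an MSR Action3D skeleton text file into an array of shape
    # (num_frames, joints_per_frame, 3). Assumes four values per joint
    # (x, y, z, confidence); the confidence column is dropped.
    values = np.loadtxt(path)                        # (num_frames * joints_per_frame, 4)
    frames = values.reshape(-1, joints_per_frame, 4)
    return frames[:, :, :3]

# Example (hypothetical file name):
# skeleton = load_msr_skeleton("a01_s01_e01_skeleton3D.txt")
# print(skeleton.shape)   # (num_frames, 20, 3)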

[0050] Step 2: Using a frame selection algorithm improved from the traditional accumulated motion energy method, select effective image fr...
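This step is truncated in the extracted text, so the sketch below only illustrates the general accumulated-motion-energy idea it refers to: score each frame by how much the skeleton joints move relative to the previous frame and keep the frames whose energy exceeds a threshold. The energy definition and the threshold are illustrative assumptions, not the patent's exact formulation.

import numpy as np

def select_effective_frames(skeleton, threshold_ratio=0.2):
    # skeleton: array of shape (num_frames, num_joints, 3).
    # Energy of frame t = summed joint displacement from frame t-1;
    # frames with energy above threshold_ratio * max energy are kept.
    diffs = np.diff(skeleton, axis=0)                    # (T-1, J, 3)
    energy = np.linalg.norm(diffs, axis=2).sum(axis=1)   # (T-1,)
    keep = np.where(energy > threshold_ratio * energy.max())[0] + 1
    return skeleton[keep], keep

# effective_frames, kept_indices = select_effective_frames(skeleton)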



Abstract

The invention relates to a human action recognition method. The method comprises the following steps: S1, continuous image frames containing human action recognition feature data are acquired; S2, effective image frames are screened, and the feature data in the effective frames serve as the data to be detected; S3, a static posture feature vector Fcc, a continuous action feature vector Fcp and an overall action feature vector Fco of the data to be detected are built as action features; S4, a final feature vector Fc is built, where Fc=[Fcc, Fcp, Fco]; S5, the final feature vector Fc is subjected to dimension reduction; and S6, a well-trained model classifier performs action recognition on the reduced final feature vector Fc and outputs the recognition result. Compared with the prior art, the method of the invention offers a high recognition rate with a small amount of computation.
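Below is a minimal sketch of steps S3-S6 as described above, assuming the three feature vectors Fcc, Fcp and Fco have already been computed for each sequence. PCA stands in for the unspecified dimension-reduction step and a linear SVM for the "well-trained model classifier" (the CPC class G06F18/2411 points toward support vector machines, but both choices are assumptions here, not the patent's stated implementation).

import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

def build_final_feature(Fcc, Fcp, Fco):
    # S4: concatenate the static-posture, continuous-action and overall-action
    # features into the final feature vector Fc = [Fcc, Fcp, Fco].
    return np.concatenate([Fcc, Fcp, Fco])

# Hypothetical training data: one (Fcc, Fcp, Fco) triple plus a label per sequence.
# X = np.stack([build_final_feature(*feats) for feats in train_features])
# y = np.array(train_labels)
#
# S5: dimension reduction (PCA used here as an illustrative choice).
# pca = PCA(n_components=64).fit(X)
# X_red = pca.transform(X)
#
# S6: train the classifier and recognize a new sequence.
# clf = SVC(kernel="linear").fit(X_red, y)
# Fc_new = build_final_feature(Fcc_new, Fcp_new, Fco_new)
# prediction = clf.predict(pca.transform(Fc_new[None, :]))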

Description

Technical Field

[0001] The invention relates to the fields of computer vision and pattern recognition, and in particular to a human body action recognition method.

Background

[0002] In daily life, recognizing human body movements with the naked eye is very simple. However, automatically classifying human actions with a computer vision system is a very complex and challenging task. Many problems remain to be solved, such as capturing human motion information, learning from training samples, recognizing small temporal and spatial variations between similar motion patterns, and inferring human behavioral intentions. At present, human motion recognition technology is not yet mature, and there is still a long way to go. The main difficulties faced by this research are:

[0003] (1) Human body structure and movement

[0004] The human body is a complex organism composed of a series of skeletal joints, and its movements are composed ...


Application Information

IPC(8): G06K9/00; G06K9/62
CPC: G06V40/103; G06F18/2411
Inventors: 张相芬, 刘絮雨, 房博文, 马燕, 李传江
Owner: SHANGHAI NORMAL UNIVERSITY