
Human action recognition method based on multi-feature motion-in-depth diagram

A recognition method based on depth motion map technology, applied in character and pattern recognition, instruments, and computer parts. It addresses problems such as high feature redundancy, weak description of local detail, and misidentification of similar behaviors.

Active Publication Date: 2018-07-13
XIANGTAN UNIV

AI Technical Summary

Problems solved by technology

[0007] b. In depth-motion-map-based methods, the fused features generated by directly concatenating the depth motion maps of the three projection directions are not only highly redundant but also weak at describing local detail;
[0008] c. Only a single sparse reconstruction error is considered when classifying test samples, which easily causes misidentification of similar behaviors (such as running and walking).




Embodiment Construction

[0081] A specific embodiment of the present invention is described below in conjunction with the accompanying drawings. Figure 1 is a schematic flow chart of human action recognition based on multi-feature depth motion maps in this embodiment. The invention discloses a human action recognition method based on multi-feature depth motion maps. The specific implementation steps are: (1) project every frame of the depth video onto three orthogonal planes: front, side, and top; (2) on each plane, stack the absolute differences of consecutive projected frames to form a Depth Motion Map, DMM_v (v ∈ {f, s, t}), then extract LBP, GIST, and HOG features from each map and form feature descriptors corresponding to the three directions; (3) perform feature fusion and dimensionality reduction on the three different feature descriptors; (4) for each action class, compute the sparse reconstruction errors of the fused, dimension-reduced features under the l1 norm and the l2 norm, and adaptively weight and fuse the two reconstruction errors to obtain the recognition result.
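Step (2) above can be sketched in code. The snippet below is a minimal illustration, not the patent's implementation: it assumes the depth video has already been projected onto one plane, and forms a DMM by summing the absolute differences of consecutive projected frames. The function name and the random test data are illustrative.

```python
import numpy as np

def depth_motion_map(frames):
    """Form a Depth Motion Map (DMM) for one projection plane by
    stacking (summing) the absolute differences of consecutive
    projected frames, as in step (2).

    frames: sequence of 2-D projections (front, side, or top) of the
    depth video, shape (num_frames, H, W)."""
    frames = np.asarray(frames, dtype=np.float64)
    # |frame_{i+1} - frame_i| for every consecutive pair, then sum
    # over the whole sequence to accumulate the motion energy.
    return np.abs(np.diff(frames, axis=0)).sum(axis=0)

# Hypothetical depth video: 10 frames of 32x32 front-view projections.
rng = np.random.default_rng(0)
video = rng.random((10, 32, 32))
dmm_front = depth_motion_map(video)
print(dmm_front.shape)  # (32, 32)
```

The same routine would be applied to the side and top projections, after which LBP, GIST, and HOG descriptors are extracted from each of the three resulting maps.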



Abstract

The invention discloses a human action recognition method based on a multi-feature motion-in-depth diagram. The method specifically includes the steps of: (1) projecting all frames of a depth video onto the front, side, and top orthogonal planes; (2) stacking the absolute pixel differences of consecutive projected frames on each plane to form the motion-in-depth diagram DMM_v (v ∈ {f, s, t}), and respectively extracting LBP, GIST, and HOG characteristics to form characteristic descriptors corresponding to the three directions; (3) using a relative-entropy principal component analysis method to perform characteristic fusion and dimensionality reduction on the three different types of characteristic descriptors; (4) calculating, for each type of action training sample, the sparse reconstruction errors of the fused and dimension-reduced characteristics under an l1 norm and an l2 norm, and using the entropy weight method to perform adaptive weighted fusion of the two types of reconstruction errors, so as to design an error-fusion-based classifier for action recognition. The method can improve the accuracy of human action recognition.
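Step (4) of the abstract — entropy-weighted fusion of the l1- and l2-norm sparse reconstruction errors — can be illustrated as follows. This is a sketch of the standard entropy weight method under my reading of the abstract, not the patent's exact formulation; the function names and the example error values are hypothetical.

```python
import numpy as np

def entropy_weights(error_matrix):
    """Entropy weight method: rows are candidate action classes,
    columns are error types (l1, l2). Returns one weight per column.

    An error type whose per-class distribution has lower entropy is
    more discriminative and therefore receives a larger weight."""
    E = np.asarray(error_matrix, dtype=np.float64)
    # Normalize each column into a probability distribution over classes.
    p = E / E.sum(axis=0, keepdims=True)
    n = E.shape[0]
    # Shannon entropy per column, scaled to [0, 1]; 0*log(0) treated as 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -np.nansum(p * np.log(p), axis=0) / np.log(n)
    return (1.0 - h) / (1.0 - h).sum()

def classify(err_l1, err_l2):
    """Adaptively fuse per-class l1 and l2 reconstruction errors and
    return the index of the class with the smallest fused error."""
    E = np.column_stack([err_l1, err_l2])
    fused = E @ entropy_weights(E)
    return int(np.argmin(fused))

# Hypothetical reconstruction errors for 4 action classes.
e1 = [0.9, 0.2, 0.8, 0.7]  # l1-norm errors
e2 = [0.8, 0.3, 0.9, 0.6]  # l2-norm errors
print(classify(e1, e2))  # 1
```

Because the weights adapt to how discriminative each error type is for the current test sample, similar actions (such as running and walking) are less likely to be confused than when a single sparse reconstruction error is used.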

Description

Technical field
[0001] The invention belongs to the fields of artificial intelligence and pattern recognition, and in particular relates to human action recognition based on multi-feature depth motion maps.
Background technique
[0002] Human action recognition has been widely applied in abnormal behavior analysis, intelligent monitoring, and home security. It has been an active research area over the past few decades, with research focused mainly on RGB video sequences. For example, because spatio-temporal interest points carry rich motion information, a Harris detector can be used to detect interest points in three-dimensional space-time, and these interest points are then used to describe actions and recognize behaviors; dense trajectory features, formed by tracking the optical flow of sampled points in each frame, have been widely used in human action recog...


Application Information

IPC(8): G06K9/00; G06K9/62
CPC: G06V40/20; G06V20/41; G06F18/2135; G06F18/253; G06F18/214
Inventor: 王冬丽, 欧芳, 周彦
Owner: XIANGTAN UNIV