
Multi-angle indoor human action recognition method based on 3D skeleton

A recognition method using multi-view technology, applied in character and pattern recognition, instruments, computer parts, etc.; it addresses the problem that existing methods do not analyze the rotation of skeleton joint points, and achieves the effect of overcoming the limitations of single-angle recognition.

Inactive Publication Date: 2016-06-01
WUHAN INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

Shotton et al. proposed an algorithm that uses depth images to obtain 3D skeleton positions, but it represents the skeleton only with simple joint points and does not analyze information such as the rotation of the skeleton joints.



Examples


Embodiment Construction

[0018] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it.

[0019] As shown in Figure 1 and Figure 2, the multi-view indoor human behavior recognition method based on the 3D skeleton includes the following steps:

[0020] 1) Acquire videos of human body movement at three angles: facing the camera (-10° to 10°), to the right of the camera (20° to 70°), and to the left of the camera (-20° to -70°). The videos include training videos and test videos. In this embodiment, the Kinect somatosensory device is used to collect the video data: RGB and depth data are captured, a video file in ONI format is generated, and the three-dimensional coordinates and confidence values of the skeletal joint points are computed at the same time, and ...
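As a rough illustration of the angle grouping in step 1, the sketch below estimates a subject's torso yaw from the 3D positions of the two shoulder joints and bins it into the three capture views. This is a minimal sketch only: the choice of shoulder joints for the orientation estimate and the helper names are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

def torso_yaw_degrees(left_shoulder, right_shoulder):
    """Yaw of the shoulder line in the camera's X-Z plane; 0 degrees = facing the camera."""
    dx, _, dz = np.asarray(right_shoulder, dtype=float) - np.asarray(left_shoulder, dtype=float)
    # For a subject squarely facing the camera the shoulder line is parallel
    # to the image plane, so dz is near 0 and the yaw is near 0 degrees.
    return np.degrees(np.arctan2(dz, dx))

def angle_bin(yaw_deg):
    """Map a yaw angle to the three capture views described in step 1 (assumed bin labels)."""
    if -10.0 <= yaw_deg <= 10.0:
        return "front"   # facing the camera, -10 to 10 degrees
    if 20.0 <= yaw_deg <= 70.0:
        return "right"   # to the right of the camera, 20 to 70 degrees
    if -70.0 <= yaw_deg <= -20.0:
        return "left"    # to the left of the camera, -20 to -70 degrees
    return "other"

# Example: shoulders roughly parallel to the image plane -> "front"
print(angle_bin(torso_yaw_degrees([-0.2, 1.4, 2.0], [0.2, 1.4, 2.05])))
```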



Abstract

The invention discloses a multi-angle indoor human action recognition method based on a 3D skeleton. The method comprises the following steps: 1) videos of human motions at three angles (a front view, an oblique view and a side view) are acquired, the videos including training videos and test videos; 2) 3D human skeleton features are extracted from the videos by means of somatosensory equipment, the 3D skeleton features comprising global motion features and local motion features of the arms and legs; 3) model training is performed: the 3D skeleton features of the training videos are used for feature description to obtain a training feature set; specifically, online dictionary learning is applied to the 3D skeleton features, and dimension reduction is then performed through sparse principal component analysis to form the feature data set; 4) the feature set of the test-video samples is input, and recognition is performed with a linear support vector machine (LSVM). The method realizes classified recognition of multi-angle motions, so the limitation of single-angle recognition algorithms can be overcome, giving the method greater research and practical application value.
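For concreteness, the following is a minimal sketch of the pipeline the abstract describes (online dictionary learning on the skeleton features, sparse PCA for dimension reduction, then a linear SVM), using scikit-learn components as stand-ins. The feature dimensions, dictionary size, component counts, class count and the random placeholder data are assumptions for illustration; the patent's actual feature extraction and parameter choices are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning, SparsePCA
from sklearn.svm import LinearSVC

# Placeholder skeleton-feature matrices: one row per video sample, columns are
# the concatenated global and arm/leg local motion features (shapes are illustrative).
rng = np.random.default_rng(0)
X_train = rng.standard_normal((200, 120))
y_train = rng.integers(0, 8, size=200)        # 8 hypothetical action classes
X_test = rng.standard_normal((50, 120))

# 1) Online (mini-batch) dictionary learning: re-encode each sample as a
#    sparse code over a learned dictionary.
dico = MiniBatchDictionaryLearning(n_components=64, batch_size=16,
                                   transform_algorithm="lasso_lars",
                                   random_state=0)
codes_train = dico.fit_transform(X_train)
codes_test = dico.transform(X_test)

# 2) Sparse principal component analysis for dimension reduction of the codes.
spca = SparsePCA(n_components=32, random_state=0)
feats_train = spca.fit_transform(codes_train)
feats_test = spca.transform(codes_test)

# 3) Linear support vector machine for the final classification step.
clf = LinearSVC()
clf.fit(feats_train, y_train)
predictions = clf.predict(feats_test)
print(predictions[:10])
```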

Description

Technical Field

[0001] The invention relates to the field of human behavior recognition, and in particular to a multi-view indoor human behavior recognition method based on a 3D skeleton.

Background Technique

[0002] In recent years, action recognition has become a research hotspot in the fields of computer vision and pattern recognition. Behavior recognition technology is not only used in many fields such as video surveillance systems, smart homes, sports motion analysis, human-computer interaction, film and television action production, and medical rehabilitation, but also creates huge social and economic benefits. According to the behavior description method used, existing algorithms can be divided into two categories: appearance-based methods, which describe behavior directly from image features, and human-body-model-based methods, which use a human body model to obtain the structural characteristic...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/00; G06K 9/62
CPC: G06V 40/23; G06F 18/2411
Inventors: 鲁统伟, 彭玲, 缪少君, 刘文婷, 张彦铎, 李晓林, 卢涛, 闵锋, 李迅, 周华兵, 朱锐
Owner: WUHAN INSTITUTE OF TECHNOLOGY