
Human behavior identification method based on depth information

A human behavior recognition method based on depth information, in the field of computer vision and image processing, that addresses the lack of specific spatial structure information in existing depth-based action representations.

Inactive Publication Date: 2016-08-31
CIVIL AVIATION UNIV OF CHINA

AI Technical Summary

Problems solved by technology

Although this method compensates to some extent for the complete loss of depth information and expresses actions more fully, the 3D representation used in that action-expression scheme encodes only the direction of depth change and lacks more specific spatial structure information.



Embodiment Construction

[0025] The human behavior recognition method based on depth information provided by the present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0026] The human behavior recognition method based on depth information provided by the present invention includes the following steps in order:

[0027] (1) As shown in figure 1, a depth camera is used to collect a series of depth images of several people performing multiple different actions against the same background, where one action performed by one person corresponds to multiple frames of depth images. These depth images are then transmitted to the computer connected to the depth camera. Because the collected depth images contain background in addition to the human body, the depth image of the human body region is extracted as the foreground from each collected frame on the computer by setting a depth value threshold (0-2500m...
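The foreground-extraction step above keeps only pixels whose depth falls inside a threshold window. A minimal sketch with NumPy; the exact threshold units and range are an assumption, since the value in the text is truncated:

```python
import numpy as np

def extract_foreground(depth, near=0.0, far=2500.0):
    """Keep only pixels whose depth lies in (near, far); zero out
    everything else (background and invalid zero readings).
    The (0, 2500) window is an assumed example range."""
    mask = (depth > near) & (depth < far)
    return np.where(mask, depth, 0.0), mask

# Toy 3x3 "depth image": 0 = invalid reading, 4000 = background wall.
depth = np.array([[0.0,    1200.0, 4000.0],
                  [1300.0, 1250.0, 4000.0],
                  [0.0,    1400.0, 4000.0]])
fg, mask = extract_foreground(depth)  # only the four ~1.2-1.4 k pixels survive
```

In practice the threshold would be chosen so that the subject stands inside the window while walls and floor fall outside it.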



Abstract

The invention relates to a human behavior recognition method based on depth information. The method comprises the following steps: depth images are acquired with a depth camera; the two-dimensional depth image coordinate system is converted to a three-dimensional camera coordinate system; the X, Y and Z values of each three-dimensional point in the camera coordinate system are assigned to the X, Y and Z values of a point-cloud three-dimensional point; the multiple frames of each action, converted to three-dimensional coordinates, fill a three-dimensional volume stored in point-cloud format, yielding a three-dimensional human motion history volume; the motion history volumes are divided into recognition samples and training samples; a word-frequency distribution histogram and a class-statistics histogram of each training sample are computed and input to an SVM to train the action classifier model; the word-frequency distribution histogram of a recognition sample is then input to the classifier model for recognition, and the recognition result is obtained. By means of the generated three-dimensional human motion history volume, the method overcomes the insufficient use of depth values and the lack of specific spatial structure information.
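The conversion from the two-dimensional depth image coordinate system to the three-dimensional camera coordinate system is typically done by back-projecting each pixel through a pinhole camera model. A minimal sketch; the intrinsic parameters (fx, fy, cx, cy) are hypothetical values, not taken from the patent:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project each valid depth pixel (u, v) with depth Z into the
    3-D camera coordinate system using the pinhole model:
        X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = Z.
    Returns an (N, 3) array of point-cloud coordinates."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel grids
    valid = depth > 0                               # skip invalid pixels
    z = depth[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    return np.stack([x, y, z], axis=1)

# Single valid pixel at (u=2, v=1) with depth 1000, assumed intrinsics.
depth = np.zeros((4, 4))
depth[1, 2] = 1000.0
pts = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
```

Accumulating such point clouds over the frames of one action is what fills the three-dimensional human motion history volume described above.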

Description

Technical field

[0001] The invention belongs to the technical field of computer vision and image processing, and in particular relates to a behavior recognition method based on depth information.

Background technique

[0002] With the rapid development of computer vision technology, the need for video-based human behavior recognition is becoming increasingly urgent. In such systems, behavior recognition plays an increasingly important role.

[0003] Most early human behavior recognition was carried out on ordinary color (RGB) image sequences, and color image recognition mostly relied on appearance features. As technology has improved, depth cameras equipped with depth sensors have appeared in recent years. Such a camera can obtain depth images of acceptable quality while acquiring ordinary RGB images. Compared with color images, depth images have the following characteristics in behavior recognition: first, depth data is only related to the spatial position of the obje...
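The pipeline described in the abstract ends with an SVM trained on word-frequency distribution histograms of the training samples. A minimal sketch using scikit-learn; the 8-bin histograms and two action classes below are illustrative assumptions, not data from the patent:

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical word-frequency histograms (8 visual words, rows sum to 1).
# In the patent's pipeline these would be computed from the 3-D human
# motion history volumes; the values here are illustrative only.
train_hists = np.array([
    [0.80, 0.10, 0.10, 0.00, 0.00, 0.00, 0.00, 0.00],  # action class 0
    [0.70, 0.20, 0.10, 0.00, 0.00, 0.00, 0.00, 0.00],
    [0.75, 0.15, 0.10, 0.00, 0.00, 0.00, 0.00, 0.00],
    [0.00, 0.00, 0.00, 0.00, 0.00, 0.10, 0.20, 0.70],  # action class 1
    [0.00, 0.00, 0.00, 0.00, 0.00, 0.10, 0.10, 0.80],
    [0.00, 0.00, 0.00, 0.00, 0.00, 0.20, 0.10, 0.70],
])
labels = np.array([0, 0, 0, 1, 1, 1])

# Train a linear-kernel SVM as the action classifier model.
clf = SVC(kernel="linear").fit(train_hists, labels)

# Recognition: classify the histogram of an unseen recognition sample.
query = np.array([[0.85, 0.10, 0.05, 0.0, 0.0, 0.0, 0.0, 0.0]])
predicted_action = clf.predict(query)[0]
```

A linear kernel is a common default for histogram features; the patent does not specify the kernel, so this choice is an assumption.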

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62
CPC: G06V20/647, G06V40/23, G06F18/2411
Inventor: 张良, 刘文评
Owner CIVIL AVIATION UNIV OF CHINA