Human behavior recognition method and device based on depth camera and basic posture

A depth-camera and recognition-method technology applied in the field of human-computer interaction. It addresses problems such as slow training speed, susceptibility to the environmental background, and differences in the time taken to complete the same action, thereby improving accuracy and reliability, ensuring viewing-angle invariance, and removing the effect of noise actions.

Active Publication Date: 2020-12-08
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

[0003] However, good results have not yet been achieved for video-based human behavior recognition. There are currently two mainstream approaches. The first is based on RGB images: in addition to the two-dimensional information of each frame, the time dimension is added and a 3D convolutional neural network is used for recognition. This type of method, however, trains slowly, is easily affected by the environmental background, and its performance is not ideal. The second approach focuses on dimensionality reduction and uses a depth camera as its tool. Because a depth camera captures depth information, it can better describe the spatial motion of the human body, which is the main moving subject in video-based behavior recognition. In this approach, the person is first located and the body's key points are then extracted. Since the articulated motion of the human body is essentially determined by these key points, the time series of key-point positions can describe human motion well as sequence information.
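
As a concrete illustration of the key-point idea described above (a minimal sketch, not code from the patent), a short clip can be stored as an array of 3D joint positions over time; the joint and frame counts below are assumptions chosen only for illustration:

```python
# Illustrative sketch (not from the patent): representing human motion as a time
# series of 3D key-point coordinates captured by a depth camera.
import numpy as np

num_frames, num_joints = 60, 17                         # e.g. ~2 s of skeletons at 30 fps
skeleton_clip = np.zeros((num_frames, num_joints, 3))   # (time, joint, xyz)

# Each frame holds the 3D position of every key point; stacking frames over time
# turns articulated human motion into sequence data that a sequence model can use.
single_frame = skeleton_clip[0]                         # one skeleton: (num_joints, 3)
per_joint_motion = np.diff(skeleton_clip, axis=0)       # displacement between frames
```
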
[0004] Nevertheless, understanding human behavior in video remains difficult. One difficulty is that the viewing angle may change with the shooting position or with the orientation of the body relative to the camera; the same walking action, for example, may be observed from many different angles. Covering all of these views in the training set creates a huge database problem. In addition, for any given action, different people take different amounts of time to complete it.
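
One common way to reduce the viewing-angle problem (a hedged sketch of a generic technique, not necessarily the normalization used in this patent) is to re-express each reconstructed 3D skeleton in a body-centered coordinate frame, so the description no longer depends on where the camera was placed; the joint indices below are assumptions:

```python
# Sketch of a generic view normalization (an assumption, not the patent's exact
# procedure): translate the hip center to the origin and rotate the skeleton
# about the vertical axis so the hips face a fixed direction.
import numpy as np

def normalize_view(skeleton, hip_center=0, left_hip=1, right_hip=2):
    """skeleton: (num_joints, 3) array of 3D key points; joint indices are assumed."""
    centered = skeleton - skeleton[hip_center]          # hip center at the origin
    hip_axis = centered[right_hip] - centered[left_hip]
    angle = np.arctan2(hip_axis[1], hip_axis[0])        # heading in the ground plane
    c, s = np.cos(-angle), np.sin(-angle)
    rot_z = np.array([[c, -s, 0.0],
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])
    return centered @ rot_z.T                           # same pose, canonical view
```
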

Method used



Examples


Embodiment Construction

[0033] Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the drawings, wherein the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary and are intended to explain the present invention and should not be construed as limiting the present invention.

[0034] The following describes the human behavior recognition method and device based on a depth camera and basic postures according to embodiments of the present invention with reference to the accompanying drawings. First, the human behavior recognition method based on a depth camera and basic postures according to embodiments of the present invention is described with reference to the accompanying drawings.

[0035] Figure 1 is a flow chart of a human behavior recognition method based on a depth...



Abstract

The invention discloses a human behavior recognition method and device based on a depth camera and basic postures. The method includes: capturing images of human behavior with a depth camera and extracting key point information; reconstructing the three-dimensional structure of the human body from the input images; identifying key poses from the reconstructed 3D human body data and extracting the corresponding features to obtain a time sequence of pose feature descriptions; matching this sequence against a pre-trained feature description set to obtain an initial time description sequence of basic postures; removing repeated information from the initial time description sequence to obtain a final time description sequence; and inputting the final time description sequence into a pre-trained long short-term memory network for human behavior recognition to obtain the recognition result. The method reduces the interference caused by different speeds and non-standard movements of the recognized target, keeps the recognition viewpoint invariant, makes the database easier to build, and improves the robustness and accuracy of human behavior recognition.
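
To make two of the steps in the abstract concrete, the sketch below matches per-frame pose features against a small basic-posture library by nearest neighbour and then removes consecutive repeats; the library contents, feature vectors, and posture labels are illustrative assumptions, not the patent's implementation:

```python
# Hedged sketch of two steps from the abstract: matching pose features against a
# pre-trained basic-posture library, then removing repeated information so the
# final sequence does not depend on how long each posture is held.
import numpy as np
from itertools import groupby

# Hypothetical pre-trained feature description set: posture name -> feature vector.
pose_library = {
    "stand":     np.array([0.0, 1.0, 0.0]),
    "squat":     np.array([0.0, 0.4, 0.3]),
    "raise_arm": np.array([0.8, 1.0, 0.0]),
}

def match_basic_posture(feature):
    # Nearest-neighbour match of one frame's pose feature against the library.
    return min(pose_library, key=lambda name: np.linalg.norm(feature - pose_library[name]))

# Per-frame pose features as they might come from the reconstructed 3D skeletons (made up).
frame_features = [np.array([0.0, 1.00, 0.05]), np.array([0.0, 0.95, 0.00]),
                  np.array([0.0, 0.45, 0.30]), np.array([0.0, 0.40, 0.35]),
                  np.array([0.0, 1.00, 0.00])]

initial_sequence = [match_basic_posture(f) for f in frame_features]
final_sequence = [name for name, _ in groupby(initial_sequence)]  # repeats removed
print(initial_sequence)  # ['stand', 'stand', 'squat', 'squat', 'stand']
print(final_sequence)    # ['stand', 'squat', 'stand']
```

Collapsing consecutive repeats in this way also absorbs differences in how fast or slowly different people hold each posture, which is the "remove repeated information" step of the method.
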

Description

Technical Field

[0001] The present invention relates to the technical field of human-computer interaction, and in particular to a human behavior recognition method and device based on a depth camera and basic postures.

Background

[0002] With the development of computer vision technology and the emergence of convolutional neural networks, the accuracy and speed with which computers statically recognize many types of objects have continuously improved, giving computers a certain ability to understand the natural world. At the same time, for sequence tasks, including recognition problems with an obvious time scale such as speech recognition and language translation, long short-term memory networks have achieved good results, because they can store historical information in their cell units; even as the time scale grows, the network still preserves a relatively complete historical memory and thus achieves a good recognition ef...
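
As background to the long short-term memory network mentioned above, the following is a minimal sketch (assuming PyTorch; the feature size, hidden size, and number of behavior classes are illustrative assumptions) of how an LSTM can classify a sequence of posture descriptors:

```python
# Minimal LSTM sequence-classification sketch (assumes PyTorch is available).
# Dimensions and the number of behavior classes are illustrative, not from the patent.
import torch
import torch.nn as nn

class BehaviorLSTM(nn.Module):
    def __init__(self, feature_dim=32, hidden_dim=64, num_behaviors=10):
        super().__init__()
        # The cell state lets the network carry historical information over time.
        self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_behaviors)

    def forward(self, pose_sequence):
        # pose_sequence: (batch, time, feature_dim), one descriptor per time step
        _, (h_n, _) = self.lstm(pose_sequence)
        return self.classifier(h_n[-1])     # final hidden state summarizes the clip

# Usage: 4 clips, each with 15 time steps of 32-dimensional posture descriptors.
model = BehaviorLSTM()
logits = model(torch.randn(4, 15, 32))
predicted_behavior = logits.argmax(dim=1)
```
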

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T7/50, G06T17/00, G06K9/00, G06K9/46, G06K9/62
CPC: G06T7/50, G06T17/00, G06V40/20, G06V10/462, G06F18/23
Inventors: 陈峰, 孙鹏飞, 王贵锦
Owner: TSINGHUA UNIV