
Human skeleton behavior identification method, system and device based on graph convolutional network

A technique combining graph convolutional networks and human skeleton data, applied in the fields of computer vision and deep learning, which addresses the problem of low recognition accuracy.

Active Publication Date: 2019-09-10
Assignee: INST OF AUTOMATION CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0005] In order to solve the above problem in the prior art, namely that the accuracy of human skeleton behavior recognition results based on graph convolutional neural networks is not high, the present invention provides a human skeleton behavior recognition method based on a graph convolutional network, comprising:




Detailed Description of the Embodiments

[0063] The present application is described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the related invention, not to limit it. It should also be noted that, for convenience of description, only the parts related to the invention are shown in the drawings.

[0064] It should be noted that, where no conflict arises, the embodiments of the present application and the features within those embodiments may be combined with one another. The present application is described in detail below with reference to the accompanying drawings and embodiments.

[0065] A human skeleton behavior recognition method based on a graph convolutional network according to the present invention comprises:

[0066] Step S10: obtaining preset video frames from the skeleton video and normalizing them to form the skeleton sequence to be recognized; ...
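The patent does not detail the normalization in Step S10 here. A minimal sketch, assuming each frame's joint coordinates are translated relative to a chosen root joint and scaled by the largest joint-to-root distance (the root index and scaling rule are illustrative assumptions, not the patent's actual procedure):

```python
import numpy as np

def normalize_skeleton(frames, root_idx=0):
    """Normalize a skeleton sequence of shape (T, J, C):
    T frames, J joints, C coordinates per joint.
    Translates each frame so the root joint sits at the origin,
    then scales by the largest joint-to-root distance in the sequence."""
    frames = np.asarray(frames, dtype=np.float64)
    # Translate: subtract the root joint's coordinates frame by frame.
    centered = frames - frames[:, root_idx:root_idx + 1, :]
    # Scale: divide by the maximum joint-to-root distance over the sequence.
    scale = np.linalg.norm(centered, axis=-1).max()
    return centered / scale if scale > 0 else centered

# Toy sequence: 2 frames, 3 joints, 2-D coordinates.
seq = [[[1.0, 1.0], [2.0, 1.0], [1.0, 3.0]],
       [[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]]]
norm = normalize_skeleton(seq)
```

Normalizing per sequence rather than per frame keeps relative motion between frames intact, which matters for the later temporal convolution.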



Abstract

The invention belongs to the fields of computer vision and deep learning, and in particular relates to a human skeleton behavior recognition method, system and device based on a graph convolutional neural network, aiming to solve the problem that skeleton-based behavior recognition results from graph convolutional networks are low in accuracy. The method comprises the following steps: acquiring skeleton video frames and normalizing them; constructing the natural human-joint connection graph for each frame; learning non-natural connection edges to obtain a complete human joint connection graph; assigning a weight to each edge of that graph; performing a graph convolution operation to obtain the spatial information of the skeleton sequence; and performing a convolution operation along the time dimension to obtain the behavior category of the skeleton sequence. Because the natural connection edges learn basic human behavior features while the non-natural connection edges learn additional behavior features, and the two together form a single graph, human motion information is represented more fully and recognition performance is improved.
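The pipeline the abstract describes (natural edges plus learned non-natural edges, per-edge weights, a spatial graph convolution, then a temporal convolution) can be sketched roughly as below. The toy joint graph, the random stand-ins for the learned adjacency and edge-weight matrices, and all layer shapes are illustrative assumptions, not the patent's actual parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)
J, T, C_in, C_out = 5, 8, 3, 4   # joints, frames, channels (illustrative)

# Natural connections: a toy 5-joint tree.
A_nat = np.zeros((J, J))
for i, j in [(0, 1), (1, 2), (1, 3), (2, 4)]:
    A_nat[i, j] = A_nat[j, i] = 1.0

# Non-natural connections and per-edge weights: random stand-ins here;
# in training these would be learned parameters.
A_learned = rng.random((J, J)) * 0.1
M = rng.random((J, J))                     # per-edge weight matrix
A = (A_nat + A_learned + np.eye(J)) * M    # combined weighted graph

# Symmetric normalization D^{-1/2} A D^{-1/2}, as is standard for GCNs.
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt

# Spatial graph convolution per frame: aggregate neighbors, project channels.
X = rng.standard_normal((T, J, C_in))      # skeleton sequence features
W_s = rng.standard_normal((C_in, C_out))
spatial = A_hat @ X @ W_s                  # shape (T, J, C_out)

# Temporal convolution per joint/channel: a length-3 filter over frames.
k = np.array([0.25, 0.5, 0.25])
temporal = np.apply_along_axis(
    lambda v: np.convolve(v, k, mode="same"), 0,
    spatial.reshape(T, J * C_out)).reshape(T, J, C_out)
```

A real model would stack several such spatial-temporal layers, then pool over joints and frames and apply a classifier to produce the behavior category; that final step is omitted here.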

Description

Technical Field

[0001] The invention belongs to the fields of computer vision and deep learning, and in particular relates to a human skeleton behavior recognition method, system and device based on a graph convolutional network.

Background

[0002] As an important research area of computer vision, behavior recognition aims to determine the category of human behavior in a given video. Behavior recognition is widely applied and has important research value in many fields, such as smart homes, motion analysis, video surveillance, and human-computer interaction. Current behavior recognition methods are mainly studied from two perspectives: methods based on RGB video and methods based on human skeleton joints. RGB-video-based methods are easily affected by illumination and occlusion and have poor robustness, whereas methods based on human skeleton joints are highly discriminative, unaffected by illumination, and relatively sensitive to viewing-angle transformation and ...

Claims


Application Information

IPC(8): G06K 9/00, G06K 9/62, G06N 3/04
CPC: G06V 40/23, G06N 3/045, G06F 18/2411
Inventors: 原春锋, 吕红杰, 李兵, 段运强, 胡卫明, 刘雨帆
Owner: INST OF AUTOMATION CHINESE ACAD OF SCI