
Human skeleton behavior recognition method, system, and device based on graph convolutional network

A human skeleton behavior recognition method in the fields of computer vision and deep learning, addressing the problem of low recognition accuracy.

Active Publication Date: 2021-03-02
INST OF AUTOMATION CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0005] To solve the above problem in the prior art, namely the low accuracy of human skeleton behavior recognition results based on graph convolutional neural networks, the present invention provides a human skeleton behavior recognition method based on a graph convolutional network, comprising:




Detailed Description of the Embodiments

[0063] The application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the related invention, not to limit it. It should also be noted that, for convenience of description, only the parts related to the invention are shown in the drawings.

[0064] It should be noted that, in the case of no conflict, the embodiments in the present application and the features in the embodiments can be combined with each other. The present application will be described in detail below with reference to the accompanying drawings and embodiments.

[0065] The human skeleton behavior recognition method based on a graph convolutional network of the present invention comprises:

[0066] Step S10: obtaining preset video frames from the skeleton video and normalizing them to obtain the skeleton sequence to be recognized; ...
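The patent text does not specify how the normalization in Step S10 is performed, so the sketch below only illustrates one common choice: treating the sequence as an array of shape (T, V, C) (frames, joints, coordinates), centering each frame on a root joint, and scaling coordinates into a fixed range. The function name, the root-joint convention, and the scaling scheme are assumptions for illustration, not the patented procedure.

```python
import numpy as np

def normalize_skeleton_sequence(frames, root_joint=0):
    """Normalize a skeleton sequence of shape (T, V, C):
    T video frames, V joints, C coordinates per joint.

    Centering on a root joint and scaling by the overall extent is one
    common choice, assumed here for illustration only.
    """
    seq = np.asarray(frames, dtype=np.float32)           # (T, V, C)
    seq = seq - seq[:, root_joint:root_joint + 1, :]     # center each frame on the root joint
    scale = np.abs(seq).max()                            # global extent of the motion
    if scale > 0:
        seq = seq / scale                                 # map coordinates into [-1, 1]
    return seq

# usage: 30 preset frames, 25 joints (e.g. a Kinect-style layout), 3D coordinates
dummy = np.random.randn(30, 25, 3)
skeleton_sequence = normalize_skeleton_sequence(dummy)
```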



Abstract

The invention belongs to the fields of computer vision and deep learning, and specifically relates to a human skeleton behavior recognition method, system, and device based on a graph convolutional network, aiming to solve the problem that the accuracy of human skeleton behavior recognition results based on graph convolutional neural networks is not high. The method of the present invention includes: obtaining skeleton video frames and normalizing them; constructing a natural connection graph of human body joints for each frame; learning unnatural connection edges to obtain a human body joint connection graph; assigning a weight value to each edge of the joint connection graph; performing a graph convolution operation to obtain the spatial information of the skeleton sequence; and performing a convolution operation along the time dimension to obtain the behavior category of the skeleton sequence. In the present invention, the natural connection edges learn basic human behavior characteristics, while the unnatural connection edges learn additional behavior characteristics; together they form a graph that more fully represents human body movement information and improves recognition performance.
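To make the pipeline described in the abstract concrete, the following PyTorch sketch shows one way to combine a fixed natural-connection adjacency matrix, a learnable matrix of unnatural edges, and per-edge weights, followed by a graph convolution over joints and a convolution over the time dimension. The class name, layer sizes, kernel sizes, and the specific way the two graphs are combined are illustrative assumptions, not the construction defined in the patent's description.

```python
import torch
import torch.nn as nn

class SkeletonGraphConvBlock(nn.Module):
    """Sketch of one spatial-temporal block loosely following the abstract:
    a fixed 'natural' joint-connection graph plus a learnable matrix of
    'unnatural' edges, each edge carrying a learnable weight, then a graph
    convolution over joints followed by a temporal convolution over frames.
    All sizes are assumptions for illustration.
    """
    def __init__(self, in_channels, out_channels, natural_adjacency):
        super().__init__()
        A = torch.as_tensor(natural_adjacency, dtype=torch.float32)  # (V, V) natural edges
        self.register_buffer("A_natural", A)
        self.A_learned = nn.Parameter(torch.zeros_like(A))           # learned 'unnatural' edges
        self.edge_weight = nn.Parameter(torch.ones_like(A))          # per-edge weight values
        self.spatial = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        self.temporal = nn.Conv2d(out_channels, out_channels,
                                  kernel_size=(9, 1), padding=(4, 0))
        self.relu = nn.ReLU()

    def forward(self, x):
        # x: (N, C, T, V) = batch, channels, frames, joints
        A = (self.A_natural + self.A_learned) * self.edge_weight      # combined weighted graph
        x = torch.einsum("nctv,vw->nctw", x, A)                       # aggregate joint neighbours
        x = self.relu(self.spatial(x))                                # spatial feature transform
        x = self.relu(self.temporal(x))                               # convolution over time
        return x
```

In a typical spatial-temporal graph convolutional network, several such blocks would be stacked, the output pooled over frames and joints, and a final linear layer used to predict the behavior category; that surrounding architecture is likewise an assumption here, not a detail stated in the abstract.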

Description

Technical Field

[0001] The invention belongs to the fields of computer vision and deep learning, and in particular relates to a human skeleton behavior recognition method, system, and device based on a graph convolutional network.

Background

[0002] As an important research area of computer vision, behavior recognition aims to distinguish the category of human behavior in a given video. Behavior recognition is widely applied and has important research value in many fields such as smart home, motion analysis, video surveillance, and human-computer interaction. Current behavior recognition methods are mainly studied from two perspectives: based on RGB video and based on human skeleton joints. Methods based on RGB video are easily affected by illumination and occlusion and have poor robustness, while methods based on human skeleton joint points are highly discriminative, unaffected by illumination, and relatively sensitive to viewing angle transformation and ...


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06K9/00G06K9/62G06N3/04
CPCG06V40/23G06N3/045G06F18/2411
Inventor 原春锋吕红杰李兵段运强胡卫明刘雨帆
Owner INST OF AUTOMATION CHINESE ACAD OF SCI