
An Action Recognition Method Based on Graph Convolution and Capsule Neural Network

A convolutional neural network and neural network technology, applied to neural learning methods, biological neural network models, and character and pattern recognition. It addresses problems such as the inability of conventional CNNs to draw inferences from a single example and their insensitivity to the orientation of components and their relative spatial relationships, achieving the effect of improving recognition ability.

Active Publication Date: 2022-05-17
WUHAN UNIV

AI Technical Summary

Problems solved by technology

[0008] Capsule neural network: traditional convolutional neural networks have an important limitation in that the orientation of components and their relative spatial relationships do not matter to them. A CNN only cares about whether a feature is present, and its pooling layers discard a great deal of information, such as important location information. Moreover, a CNN can only recognize an object after being fed a large amount of data, and it does not achieve genuine inference from one example to other cases.
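
As context for the capsule-network idea referred to above, the following is a minimal Python sketch of vector capsules with routing-by-agreement (in the style of Sabour et al.'s dynamic routing), where a capsule's output vector can encode pose and routing replaces pooling so spatial relationships are not discarded. The shapes, iteration count, and routing scheme are illustrative assumptions, not the exact construction used in this patent.

    # Illustrative capsule routing sketch (assumed, not the patent's exact network).
    import numpy as np

    def squash(v, axis=-1, eps=1e-8):
        # Squash nonlinearity: keeps the vector's orientation, maps its length into [0, 1).
        sq_norm = np.sum(v ** 2, axis=axis, keepdims=True)
        return (sq_norm / (1.0 + sq_norm)) * v / np.sqrt(sq_norm + eps)

    def dynamic_routing(u_hat, n_iter=3):
        # u_hat: (n_in, n_out, dim) predictions from lower-level capsules.
        n_in, n_out, _ = u_hat.shape
        b = np.zeros((n_in, n_out))                               # routing logits
        for _ in range(n_iter):
            c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # coupling coefficients
            s = (c[..., None] * u_hat).sum(axis=0)                # weighted sum per output capsule
            v = squash(s)                                         # output capsule vectors
            b = b + (u_hat * v[None]).sum(axis=-1)                # agreement updates the logits
        return v

    # Toy example: 32 lower-level capsules routed into 10 class capsules of dimension 16;
    # the length of each output vector acts as a class score.
    v = dynamic_routing(np.random.randn(32, 10, 16))
    print(v.shape, np.linalg.norm(v, axis=-1))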



Examples


Embodiment Construction

[0111] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below may be combined with one another as long as they do not conflict.

[0112] The present invention uses the NTU RGB+D dataset as the source of multi-frame continuous human action images.

[0113] A specific embodiment of the present invention, a behavior recognition method based on graph convolution and capsule neural network, is described below with reference to Figure 1 to Figure 3 and includes the following steps:

[0114] Step 1: Collect multiple frames...
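
As a rough illustration of this data-preparation stage (joint coordinates per frame, velocity and acceleration vectors, and a joint adjacency matrix, as summarized in the Abstract), here is a short Python sketch. The array shapes, the finite-difference definitions, and the partial edge list are assumptions for illustration, not the patent's exact formulas.

    # Illustrative preprocessing sketch (assumed shapes and definitions).
    import numpy as np

    def joint_kinematics(coords):
        # coords: (T, J, 3) manually annotated joint coordinates for T frames, J joints.
        velocity = np.diff(coords, axis=0, prepend=coords[:1])          # (T, J, 3)
        acceleration = np.diff(velocity, axis=0, prepend=velocity[:1])  # (T, J, 3)
        return velocity, acceleration

    def adjacency_from_edges(num_joints, edges, self_loops=True):
        # Symmetric joint adjacency matrix built from (i, j) bone pairs.
        A = np.zeros((num_joints, num_joints))
        for i, j in edges:
            A[i, j] = A[j, i] = 1.0
        if self_loops:
            A += np.eye(num_joints)
        return A

    # Toy usage: 300 frames of the 25 NTU RGB+D joints; only a few example bones are listed.
    coords = np.random.randn(300, 25, 3)
    vel, acc = joint_kinematics(coords)
    A = adjacency_from_edges(25, edges=[(0, 1), (1, 20), (20, 2), (2, 3)])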



Abstract

The present invention proposes a behavior recognition method based on graph convolution and a capsule neural network. The spatial coordinates of the human body joint points in each frame of a continuous human motion image sequence are obtained by manual annotation, and spatial coordinate vectors of the joint points are constructed from them. The spatial coordinate vectors are mapped into high-dimensional feature vectors by a multi-layer perceptron, and a joint-point adjacency matrix is constructed according to the principle of action association. Velocity space vectors of the joint points are constructed from the spatial coordinates, and acceleration space vectors are further constructed from the velocities. The graph convolutional neural network is used to extract features and the capsule neural network is used for action classification; the capsule convolutional network is built by connecting the graph convolutional network and the capsule network in series, and the trained network is obtained by training on the training set for multiple epochs. The method conforms to the characteristics of actual motion, the propagation of features on the graph better matches the real situation, and the features useful for classification are effectively retained, improving the recognition ability of the model.
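
To make the summarized pipeline concrete, the following is a minimal PyTorch-style sketch: per-joint features (position, velocity, acceleration) are embedded by a multi-layer perceptron, propagated over the joint adjacency matrix by a graph convolution, and classified by a capsule-style head whose output vector lengths serve as class scores. The layer sizes, the single graph-convolution layer, the temporal averaging, and the squash-based head are illustrative assumptions rather than the architecture specified in the patent.

    # Illustrative end-to-end sketch (assumed architecture and sizes).
    import torch
    import torch.nn as nn

    class GraphConvCapsuleNet(nn.Module):
        def __init__(self, in_dim=9, hid_dim=64, num_joints=25, num_classes=60, caps_dim=16):
            super().__init__()
            self.embed = nn.Sequential(                    # MLP: joint features -> high-dim features
                nn.Linear(in_dim, hid_dim), nn.ReLU(),
                nn.Linear(hid_dim, hid_dim), nn.ReLU())
            self.gc_weight = nn.Linear(hid_dim, hid_dim)   # graph convolution weights (A X W)
            self.caps = nn.Linear(num_joints * hid_dim, num_classes * caps_dim)
            self.num_classes, self.caps_dim = num_classes, caps_dim

        @staticmethod
        def squash(v, eps=1e-8):
            n2 = (v ** 2).sum(-1, keepdim=True)
            return (n2 / (1.0 + n2)) * v / torch.sqrt(n2 + eps)

        def forward(self, x, A):
            # x: (B, T, J, in_dim) position + velocity + acceleration; A: (J, J) adjacency.
            h = self.embed(x)                                          # (B, T, J, hid)
            h = torch.einsum('ij,btjd->btid', A, self.gc_weight(h))    # propagate features on the graph
            h = h.relu().mean(dim=1)                                   # average over frames -> (B, J, hid)
            caps = self.caps(h.flatten(1)).view(-1, self.num_classes, self.caps_dim)
            return self.squash(caps).norm(dim=-1)                      # class scores = capsule lengths

    # Toy usage: 2 clips, 300 frames, 25 joints, 9 = 3 coords + 3 velocity + 3 acceleration channels.
    scores = GraphConvCapsuleNet()(torch.randn(2, 300, 25, 9), torch.eye(25))
    print(scores.shape)  # torch.Size([2, 60])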

Description

Technical field

[0001] The invention belongs to the field of behavior recognition, and in particular relates to a behavior recognition method based on graph convolution and capsule neural network.

Background technique

[0002] Behavior recognition is an extremely challenging direction in the field of computer vision, with important applications in many areas. In intelligent visual surveillance, for example in places with high security requirements such as supermarkets and banks, it can detect suspicious human behavior in real time. In human-computer interaction and advanced user interfaces, future robots are expected to understand human behavior and respond accordingly in order to communicate with humans, which can greatly improve human living standards. In content-based retrieval, in the era of information explosion, labeling videos helps accurately retrieve the desired videos from massive data. In s...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06V40/20, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/23, G06N3/045
Inventor: 蔡贤涛, 王森, 倪波
Owner: WUHAN UNIV