
Human Action Recognition Method Based on Recurrent Convolutional Neural Network

A human action recognition and neural network technology, applied in the fields of pattern recognition, machine learning and image classification, which solves the problem of low human action recognition accuracy and achieves the effects of minimizing intra-class differences, maximizing inter-class differences, and good recognition performance.

Active Publication Date: 2022-07-08
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0004] In view of the problems in the above research, the object of the present invention is to provide a human action recognition method based on a recurrent convolutional neural network, which solves the prior-art problem of low human action recognition accuracy caused by variation within and between action categories and by the fact that a video is composed of continuous frames.

Detailed Description of Embodiments

[0052] The present invention will be further described below with reference to the accompanying drawings and specific embodiments.

[0053] A human action recognition method based on a recurrent convolutional neural network, which can be widely applied to video-based category similarity recognition, includes the following steps:

[0054] S1. Construct a data set: randomly select sequence pairs of the same length from a public data set, where each frame in each sequence includes an RGB image and an optical flow image. The public data set is the UCF101-split1 data set, the HMDB51 data set, the UCFSPORT data set or the UCF11 data set, and the two action segments in a sequence pair come either from the same action category or from different action categories. Specifically, the video sequences in the public data set are first cut into fixed-length action segments (segments) to obtain multiple sequences, and a pair of segments, i.e. a sequence pair, is randomly selected; this sequence pair may come from the same action category or from different action categories.
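
As an illustration of step S1, here is a minimal sketch, assuming a simple in-memory dataset layout, of how fixed-length segments could be cut from videos and paired as same-class or different-class sequence pairs. The segment length, helper names and data structures below are assumptions for illustration, not taken from the patent.

import random

SEGMENT_LEN = 16  # assumed fixed segment length in frames; the patent does not specify a value here

def cut_segments(video_frames, segment_len=SEGMENT_LEN):
    """Cut one video, given as a list of (rgb_image, flow_image) frame pairs, into fixed-length segments."""
    return [video_frames[i:i + segment_len]
            for i in range(0, len(video_frames) - segment_len + 1, segment_len)]

def build_sequence_pairs(dataset, num_pairs, same_ratio=0.5):
    """dataset: dict mapping action label -> list of videos (each a list of (rgb, flow) frames).
    Returns (segment_a, segment_b, same_flag) triples for training a twin network."""
    by_label = {}
    for label, videos in dataset.items():
        for video in videos:
            by_label.setdefault(label, []).extend(cut_segments(video))

    pairs = []
    for _ in range(num_pairs):
        if random.random() < same_ratio:
            # Positive pair: two segments drawn from the same action category.
            label = random.choice([l for l, segs in by_label.items() if len(segs) >= 2])
            seg_a, seg_b = random.sample(by_label[label], 2)
            pairs.append((seg_a, seg_b, 1))
        else:
            # Negative pair: one segment from each of two different action categories.
            label_a, label_b = random.sample(sorted(by_label), 2)
            pairs.append((random.choice(by_label[label_a]),
                          random.choice(by_label[label_b]), 0))
    return pairs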

Abstract

The invention discloses a human action recognition method based on a recurrent convolutional neural network, which belongs to the fields of image classification, pattern recognition and machine learning and addresses the low accuracy of action recognition in the prior art. The invention constructs a data set, namely randomly selects sequence pairs of the same length from a public data set, where each frame in each sequence includes an RGB image and an optical flow image; constructs a twin (Siamese) network, in which each branch sequentially comprises a CNN layer, an RNN layer and a Temporal Pooling layer; constructs a "recognition-verification" joint loss function; trains the deep convolutional neural network based on the data set and the "recognition-verification" joint loss function; and, for the human action sequence pair to be recognized, applies in turn the trained deep convolutional neural network and the trained "recognition-verification" joint loss function to obtain the action category recognition result of the sequence pair. The present invention is used for human action recognition in images.
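
To make the architecture described in the abstract concrete, the following is a minimal PyTorch sketch, assuming a small CNN backbone, a GRU as the RNN layer, average pooling as the temporal pooling, and a contrastive-style verification term; none of these specific choices (backbone, hidden sizes, exact loss form, margin) are stated in the patent text, and a real implementation would also process the optical flow stream alongside RGB.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ActionBranch(nn.Module):
    """One branch of the twin network: per-frame CNN -> RNN over time -> temporal pooling -> classifier."""
    def __init__(self, feat_dim=256, num_classes=101):
        super().__init__()
        # Small per-frame CNN; a practical system would likely use a pretrained backbone instead.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.rnn = nn.GRU(64, feat_dim, batch_first=True)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, frames):                               # frames: (B, T, 3, H, W)
        b, t = frames.shape[:2]
        f = self.cnn(frames.flatten(0, 1)).view(b, t, -1)    # per-frame features
        h, _ = self.rnn(f)                                    # temporal modelling
        v = h.mean(dim=1)                                     # temporal (average) pooling
        return v, self.classifier(v)

def joint_loss(feat_a, logits_a, labels_a, feat_b, logits_b, labels_b, same, margin=1.0):
    """Joint recognition-verification objective: softmax recognition loss on each sequence,
    plus a verification term that pulls same-class pairs together and pushes different-class pairs apart."""
    recognition = F.cross_entropy(logits_a, labels_a) + F.cross_entropy(logits_b, labels_b)
    dist = F.pairwise_distance(feat_a, feat_b)
    verification = torch.where(same.bool(), dist.pow(2), F.relu(margin - dist).pow(2)).mean()
    return recognition + verification

In use, both sequences of a pair are passed through the same weight-shared branch, e.g. feat_a, logits_a = branch(seq_a) and feat_b, logits_b = branch(seq_b), and the two loss terms are minimized jointly, which is one standard way to minimize intra-class differences while maximizing inter-class differences.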

Description

Technical Field

[0001] A human action recognition method based on a recurrent convolutional neural network, used for human action recognition in images, belonging to the fields of image classification, pattern recognition and machine learning.

Background Technique

[0002] Human action recognition is one of the hot and cutting-edge research topics in the fields of computer vision and machine learning, with broad application prospects in intelligent video surveillance, intelligent human-computer interaction, and content-based video analysis.

[0003] The main problem to be solved in video-based human action recognition is to process and analyze, by computer, the original image or image-sequence data collected by the sensor (camera), and to learn and understand human actions and behaviors. Human action recognition mainly comprises the following three steps: first, detect appearance and motion information from the image frames and extract low-level features; then model beha...
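
The remainder of this passage is cut off in the available text, but the general three-step pipeline it describes (per-frame feature extraction, temporal modelling of the behaviour, and, presumably, classification into an action category) can be illustrated with the following purely schematic sketch; every function body here is a crude placeholder, not the patent's method.

import numpy as np

def recognize_action(frames, prototypes):
    """Schematic pipeline. frames: list of HxWx3 arrays; prototypes: dict of action name -> reference descriptor."""
    # 1. Extract low-level appearance/motion features from each frame (here: crude colour and gradient statistics).
    feats = []
    for frame in frames:
        gy, gx = np.gradient(frame.mean(axis=-1))
        feats.append([frame.mean(), np.abs(gx).mean(), np.abs(gy).mean()])
    # 2. Model the behaviour over time (here: simple temporal averaging of the per-frame features).
    descriptor = np.mean(feats, axis=0)
    # 3. Classify the modelled behaviour (here: nearest reference descriptor).
    return min(prototypes, key=lambda name: np.linalg.norm(descriptor - prototypes[name]))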

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06V40/20, G06N3/04, G06N3/08
CPC: G06N3/084, G06V40/23, G06N3/045
Inventors: 程建, 高银星, 汪雯, 苏炎洲, 白海伟
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA