
Group activity identification method based on coherence constraint graph long-short-term memory network

A technology relating to long short-term memory networks and group activity recognition, applied in neural learning methods, character and pattern recognition, biological neural network models, etc.; it addresses problems such as the exaggeration of outlier motions unrelated to the group activity.

Inactive Publication Date: 2019-12-17
NANJING UNIV OF SCI & TECH
Cites: 0 · Cited by: 3

AI Technical Summary

Problems solved by technology

However, the above method assumes that the motions of all people contribute equally to the group activity, which suppresses the contribution of some coherent motions to the overall activity and exaggerates some outlier motions that are not related to the group activity.




Embodiment Construction

[0052] 1. A group activity recognition method based on a coherence-constrained graph long short-term memory network, comprising four processes: learning individual motion states under the coherence constraints of the spatio-temporal context; quantifying the contribution of individual motions to the group activity under the coherence constraints of the global context; using an aggregation LSTM to obtain a hidden representation of the group activity; and obtaining a probability class vector for the group activity.

[0053] Learning the motion state of an individual under the constraints of spatio-temporal context coherence includes the following steps:

[0054] Step 1. Use a pre-trained convolutional neural network (CNN) to extract CNN features for each person within the detected and tracked bounding boxes. The backbone may be AlexNet, VGG, ResNet, or GoogLeNet.

[0055] Step 2. Add a time confidence gate to the ordinary graph long ...
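Step 2 is truncated above, so the following is only one plausible reading of a "time confidence gate": a standard LSTM cell augmented with an extra sigmoid gate that scales how much of the previous hidden state is trusted at the current step. Every detail here (the gate's wiring, its inputs, the dimensions) is an assumption for illustration, not the patent's formulation.

```python
import torch
import torch.nn as nn

class ConfidenceGatedLSTMCell(nn.Module):
    """LSTM cell with a hypothetical time confidence gate on h_{t-1}."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        # Confidence gate: scores how reliable the previous hidden state
        # is for the current time step, given the current input.
        self.conf_gate = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x, state):
        h_prev, c_prev = state
        conf = torch.sigmoid(self.conf_gate(torch.cat([x, h_prev], dim=-1)))
        # Down-weight the previous hidden state before the ordinary update.
        return self.cell(x, (conf * h_prev, c_prev))
```

In a graph LSTM, such a cell would additionally receive messages from each person's spatio-temporal neighbors; that neighborhood aggregation is omitted here for brevity.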



Abstract

The invention provides a group activity recognition method based on a coherence-constrained graph long short-term memory network. The method comprises the following steps: (1) taking the CNN features of all persons as the input of the coherence-constrained graph long short-term memory network, and jointly learning the individual motion states of all persons over time under the coherence constraint of the spatio-temporal context; (2) quantifying the contributions of related motions by learning the attention factors corresponding to different motions, using an attention mechanism based on global context coherence; (3) at each time step, aggregating all the individual motion states, weighted by their attention factors, into a hidden representation of the whole activity using an aggregation LSTM, and feeding the hidden representation at each time step into a softmax classifier; and (4) averaging the softmax classifier outputs over all time steps to obtain a probability class vector for the group activity, from which the group activity class is inferred.
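Steps (2)-(4) of the abstract can be sketched as a small module: attention factors over the per-person motion states, an aggregation LSTM over the attention-weighted sum at each time step, and averaging the per-step softmax outputs into a probability class vector. The layer sizes and the attention form (a single linear scoring layer) are illustrative assumptions; the patent does not disclose these details here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GroupActivityHead(nn.Module):
    """Hypothetical sketch of attention + aggregation LSTM + softmax averaging."""

    def __init__(self, state_dim, hidden_dim, num_classes):
        super().__init__()
        self.attn = nn.Linear(state_dim, 1)        # scores each person's motion state
        self.agg_lstm = nn.LSTM(state_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, states):
        # states: (batch, time, num_people, state_dim) individual motion
        # states produced by the graph LSTM.
        alpha = F.softmax(self.attn(states), dim=2)         # attention factors per person
        pooled = (alpha * states).sum(dim=2)                # (batch, time, state_dim)
        hidden, _ = self.agg_lstm(pooled)                   # hidden activity representation
        probs = F.softmax(self.classifier(hidden), dim=-1)  # per-step class probabilities
        return probs.mean(dim=1)                            # average over time steps
```

Averaging per-step probability vectors (rather than classifying only the last hidden state) lets every time step vote on the activity class, which matches step (4) of the abstract.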

Description

Technical field

[0001] The invention relates to action recognition technology in the field of computer vision, and in particular to a group activity recognition method based on a coherence-constrained graph long short-term memory network.

Background technique

[0002] Traditional action recognition, such as single-person action recognition and two-person interaction recognition, usually involves one or two people appearing in a video, and such tasks have achieved satisfactory performance over the past few decades. Compared with traditional human behaviors, group activities are more complex but common behaviors in a scene. Unlike solo activities and two-person interactions, group activities are usually conducted by multiple people at the same time. Therefore, group activity recognition requires modeling the behaviors of multiple individuals as well as their interactions. This is a fine-grained recognition task, much more difficult than traditional single-person action recognition or ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/049; G06N3/08; G06V20/53; G06N3/045; G06F18/24
Inventor: 舒祥波, 张瑞鹏, 唐金辉, 严锐, 宋砚
Owner: NANJING UNIV OF SCI & TECH