
Behavior recognition method based on deep learning

A behavior recognition method based on deep learning, applied in the field of computer video recognition. It addresses the problem that the intelligence of video surveillance systems still needs to be improved, and achieves an improved recognition rate and good feature expression ability.

Pending Publication Date: 2019-08-30
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0007] The present invention overcomes the problem in the prior art that the intelligence of video monitoring systems needs to be improved, and provides a behavior recognition method based on deep learning with good processing performance.

Examples


Embodiment Construction

[0048] The behavior recognition method based on deep learning of the present invention is further described below with reference to the accompanying drawings and specific embodiments. It comprises the following steps. Step 1: introduce a 3D convolutional neural network into the dual-stream convolutional neural network, and build a deeper spatio-temporal dual-stream CNN-GRU neural network model by combining the dual-stream convolutional network with a GRU network.
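
Purely as an illustrative sketch (not the patent's disclosed implementation), the skeleton of such a spatio-temporal dual-stream CNN-GRU model could be assembled as below; the framework (PyTorch), class names, and layer sizes are all assumptions.

import torch
import torch.nn as nn

class DualStreamCNNGRU(nn.Module):
    """Hypothetical dual-stream CNN-GRU skeleton; Simple3DCNN is sketched under Step 2."""
    def __init__(self, num_classes, feat_dim=256, hidden_dim=512):
        super().__init__()
        # Spatial stream: 3D CNN over stacked RGB frames.
        self.spatial_cnn = Simple3DCNN(in_channels=3, out_dim=feat_dim)
        # Temporal stream: 3D CNN over stacked optical-flow fields (x/y channels).
        self.temporal_cnn = Simple3DCNN(in_channels=2, out_dim=feat_dim)
        # GRU consumes the fused, time-ordered spatio-temporal feature sequence.
        self.gru = nn.GRU(input_size=2 * feat_dim, hidden_size=hidden_dim,
                          batch_first=True)
        # Linear layer followed by softmax acts as the behavior classifier.
        self.classifier = nn.Linear(hidden_dim, num_classes)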

[0049] Step 2: apply the 3D convolutional neural network to both the spatial stream and the temporal stream of the spatio-temporal dual-stream convolutional neural network, feed more video frames into the network to participate in training, and extract the temporal and spatial features of the video.
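
A minimal sketch of the per-stream 3D convolutional feature extractor referenced above, again with assumed layer choices: it consumes a clip of shape (batch, channels, frames, height, width) and keeps the temporal axis, so each frame position yields one feature vector.

import torch.nn as nn

class Simple3DCNN(nn.Module):
    """Hypothetical 3D-CNN feature extractor, identical in structure for both streams."""
    def __init__(self, in_channels, out_dim):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(in_channels, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(kernel_size=(1, 2, 2)),        # pool space, keep all frames
            nn.Conv3d(64, out_dim, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d((None, 1, 1)),          # average away the spatial axes only
        )

    def forward(self, clip):                             # clip: (N, C, T, H, W)
        x = self.features(clip)                          # (N, out_dim, T, 1, 1)
        return x.squeeze(-1).squeeze(-1).transpose(1, 2) # (N, T, out_dim)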

[0050] Step 3: fuse the temporal features and the spatial features into a time-ordered spatio-temporal feature sequence, and use this spatio-temporal feature sequence as the input to the GRU network.
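
Continuing the hypothetical DualStreamCNNGRU sketch from Step 1, a forward pass along these lines would fuse the two per-frame feature streams into one time-ordered sequence, let the GRU model the long-range dependencies, and apply the softmax classifier; every detail here is an assumption for illustration.

    # Continuation of the DualStreamCNNGRU class sketched under Step 1.
    def forward(self, rgb_clip, flow_clip):
        spatial_seq = self.spatial_cnn(rgb_clip)        # (N, T, feat_dim)
        temporal_seq = self.temporal_cnn(flow_clip)     # (N, T, feat_dim)
        # Fuse per-frame spatial and temporal features into one ordered sequence.
        fused_seq = torch.cat([spatial_seq, temporal_seq], dim=-1)  # (N, T, 2*feat_dim)
        gru_out, _ = self.gru(fused_seq)                # (N, T, hidden_dim)
        logits = self.classifier(gru_out[:, -1])        # classify from the last time step
        return torch.softmax(logits, dim=-1)            # class probabilities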



Abstract

The invention discloses a behavior recognition method based on deep learning, which addresses the problem that the intelligence of video monitoring systems in the prior art needs to be improved. The method comprises the following steps: constructing a deeper spatio-temporal dual-stream CNN-GRU neural network model by combining a dual-stream convolutional neural network with a GRU network; extracting the temporal and spatial features of the video; using the GRU network's ability to memorize information to extract long-term sequential features from the spatio-temporal feature sequence, and performing behavior recognition on the video with a softmax classifier; proposing a new correntropy-based loss function; and, drawing on the way the human visual attention mechanism processes massive amounts of information, introducing an attention mechanism before the spatio-temporal dual-stream CNN-GRU model performs spatio-temporal feature fusion. The accuracy of the proposed model is 61.5%; compared with an algorithm based on a dual-stream convolutional neural network alone, the recognition rate is improved to a certain extent.
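
The abstract mentions a new correntropy-based loss, but its exact formula is not reproduced on this page; the snippet below is only a common Gaussian-kernel correntropy-induced loss shown as a plausible stand-in, with the kernel width sigma as an assumed hyperparameter.

import torch
import torch.nn.functional as F

def correntropy_loss(probs, targets, sigma=1.0):
    """Hypothetical correntropy-induced loss: probs (N, C) softmax outputs, targets (N,) labels."""
    one_hot = F.one_hot(targets, num_classes=probs.size(1)).float()
    error = probs - one_hot
    # Gaussian kernel of the prediction error; maximizing mean correntropy
    # corresponds to minimizing (1 - kernel).
    kernel = torch.exp(-error.pow(2) / (2 * sigma ** 2))
    return (1.0 - kernel).mean()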

Description

Technical field

[0001] The present invention relates to computer video recognition technology, and in particular to a behavior recognition method based on deep learning.

Background technique

[0002] With the development of network technology and the rapid advancement of multimedia technology, various video media have become ubiquitous in people's lives. Video has become a common way of transmitting information in modern society, and video services continue to grow rapidly. With the widespread use of video devices such as digital cameras and smartphones, sending short videos has become a popular way for people to exchange information, replacing traditional text and pictures. The cost of producing videos keeps falling, and videos spread widely across the Internet. A large amount of video information is generated every day, and a huge video library has accumulated on the Internet. Huge data resources have broug...

Claims


Application Information

IPC(8): G06K9/00; G06K9/62; G06N3/04
CPC: G06V40/20; G06V20/52; G06N3/044; G06N3/045; G06F18/214; G06F18/24
Inventor: 来兴雪, 陈颖
Owner: XIDIAN UNIV