
Dynamic and static characteristic-based video classification method

The technology concerns video classification based on dynamic and static features and belongs to a cross technical field. It addresses the problems of existing methods, such as unsatisfactory accuracy, high hardware requirements, and poor real-time performance, and achieves good accuracy and effectiveness while improving classification accuracy.

Active Publication Date: 2018-08-14
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

The advantage of the optical flow method is that it can detect moving objects without any prior knowledge of the scene, but its computational complexity is high, its real-time performance is poor, and it places high demands on hardware. Meanwhile, training a standard RNN on problems that require learning long-term temporal dependencies gives unsatisfactory results.




Detailed Description of the Embodiments

[0051] The technical scheme of the present invention is described in further detail below with reference to the accompanying drawings:

[0052] A video classification method based on dynamic and static features according to the present invention comprises the following steps:

[0053] Step 1) Input a video, namely a video provided by the user, and decompose it into video segments of 1 frame each, where the interval between video segments is 5 frames;
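As an illustration of step 1), the following minimal sketch (assuming OpenCV for decoding; the function name and everything other than the 5-frame interval stated above are illustrative assumptions, not the patent's implementation) decomposes an input video into single-frame segments taken every 5 frames:

```python
import cv2

def decompose_video(path, interval=5):
    """Read a user-supplied video and keep one frame every `interval` frames.

    Each kept frame plays the role of one single-frame video segment, as in step 1).
    """
    cap = cv2.VideoCapture(path)
    segments = []
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % interval == 0:
            segments.append(frame)  # one single-frame segment
        index += 1
    cap.release()
    return segments
```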

[0054] Step 2) Track the moving objects in the video input in step 1) with the dense trajectory tracking algorithm (DT algorithm), and use the density-based spatial clustering of applications with noise algorithm (DBSCAN clustering algorithm) to isolate each video frame, thereby capturing and tracking the dynamic information in the video. The DT algorithm densely samples feature points at multiple scales of the image through grid division; the DBSCAN clustering algorithm starts from a selected core point and contin...
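To make step 2) more concrete, here is a hedged sketch of the two ingredients named above: dense grid sampling of feature points at multiple scales (in the spirit of the DT algorithm) and DBSCAN clustering of the sampled points. The scales, step size, eps, and min_samples values are assumptions chosen for illustration, not parameters given in the patent.

```python
import numpy as np
import cv2
from sklearn.cluster import DBSCAN

def dense_sample(frame_gray, scales=(1.0, 0.5), step=10):
    """Sample feature points on a regular grid at several image scales,
    mapping them back to original-frame coordinates."""
    points = []
    for s in scales:
        resized = cv2.resize(frame_gray, None, fx=s, fy=s)
        h, w = resized.shape
        for y in range(0, h, step):
            for x in range(0, w, step):
                points.append((x / s, y / s))
    return np.array(points)

def cluster_points(points):
    """Group sampled points into spatial clusters with DBSCAN;
    label -1 marks points treated as noise (e.g. background clutter)."""
    return DBSCAN(eps=25.0, min_samples=5).fit_predict(points)
```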



Abstract

The invention discloses a dynamic and static characteristic-based video classification method, which addresses the problem of low video classification accuracy. The method processes the dynamic and static characteristics in a video separately, fuses the information by means of a Cholesky transformation, and then completes video classification with a GRU neural network. The dynamic characteristics of each video frame are captured through the DT algorithm; all video frames are isolated through the DBSCAN clustering algorithm; a motion frame is built in each frame of each video clip, and the motion frames between adjacent frames of each video clip are connected, completing the capture and tracking of the dynamic characteristics. By means of the HoG and BoW methods, the dynamic information histogram generated from the dynamic characteristics and the static information histogram generated by a CNN neural network are fused via the Cholesky transformation. Finally, the GRU neural network is used to perform video classification. By processing the dynamic and static information separately, the accuracy of video classification can be improved, with good implementability and robustness.
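The abstract does not spell out the fusion step, so the sketch below shows one common way a Cholesky-based fusion of the two feature histograms can be realised: the lower-triangular Cholesky factor of a 2x2 correlation matrix supplies the mixing weights for the dynamic (HoG + BoW) and static (CNN) vectors. The correlation value rho and the exact mixing rule are assumptions, not taken from the patent; the per-frame fused vectors would then be fed as a sequence into the GRU classifier.

```python
import numpy as np

def cholesky_fuse(dynamic_hist, static_hist, rho=0.5):
    """Fuse equal-length dynamic and static feature vectors.

    The 2x2 correlation matrix [[1, rho], [rho, 1]] is Cholesky-factored and
    the second row of its lower-triangular factor, (rho, sqrt(1 - rho**2)),
    supplies the mixing weights, so the fused vector keeps a controlled
    correlation with the dynamic features.
    """
    dynamic_hist = np.asarray(dynamic_hist, dtype=float)
    static_hist = np.asarray(static_hist, dtype=float)
    L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
    return L[1, 0] * dynamic_hist + L[1, 1] * static_hist

# Example usage (hypothetical inputs):
# fused = cholesky_fuse(hog_bow_histogram, cnn_feature_histogram)
```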

Description

Technical field
[0001] The invention relates to a video classification method based on dynamic and static features, and belongs to cross technical fields such as behavior recognition and machine learning.
Background technique
[0002] In recent years, action recognition and classification in videos has become an important research topic in computer vision, with significant theoretical importance and practical application value.
[0003] With the development of China's economy and society and the advancement of science and technology, the identification, analysis, and understanding of tasks in videos has become an important subject in both the social and natural sciences, with a wide range of applications. Compared with behavior recognition in static pictures, the background changes, tracking of dynamic objects, and high-dimensional data processing involved in videos are more complex and therefore more challenging.
[0004] The recognition of hum...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62, G06K9/46, G06T7/246
CPC: G06T7/246, G06T2207/20084, G06T2207/20081, G06T2207/10016, G06T2207/20032, G06V10/50, G06F18/23213, G06F18/241, G06F18/253
Inventor: 陈志周传岳文静陈璐刘玲掌静李争彦
Owner: NANJING UNIV OF POSTS & TELECOMM