
Method, apparatus, computer device and storage medium for classifying video actions

A video action classification technology, applied to computer components, computing, and neural learning methods; it addresses the low efficiency of manual classification and achieves the effect of improving classification efficiency.

Active Publication Date: 2019-02-22
BEIJING DAJIA INTERNET INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] Because the short video platform receives a huge number of short videos, manually classifying the actions of the objects in each short video would make the classification operation extremely inefficient.

Method used




Embodiment Construction

[0075] Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numerals in different drawings refer to the same or similar elements unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the present invention. Rather, they are merely examples of apparatuses and methods consistent with aspects of the invention as recited in the appended claims.

[0076] With the development of society, more and more users like to use fragmented time to watch or shoot short videos. When a user uploads a short video to the short video platform, the platform needs to classify the actions of the objects in the short video, such as dancing, climbing trees, drinking water, and so on. Corresponding tags can then be added to the short video ac...



Abstract

The invention relates to a method, an apparatus, a computer device and a storage medium for classifying video actions, belonging to the technical field of machine learning models. The method comprises the following steps: obtaining a video to be classified and determining a plurality of video frames in it; inputting the plurality of video frames into an optical flow substitution module of a trained video action classification model to obtain the optical flow feature information corresponding to the frames; inputting the plurality of video frames into a three-dimensional convolutional neural module of the same trained model to obtain the corresponding spatial feature information; and determining the classification category information of the video based on the optical flow feature information and the spatial feature information. With the invention, the video frames of a video to be classified can be used directly as input to the optical flow substitution module, which extracts the optical flow feature information directly from the frames, thereby further improving the efficiency of the classification processing.
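The abstract describes a two-branch architecture: a motion branch (the "optical flow substitution module") and a spatial branch (the 3D convolutional module), whose features are fused for classification. A minimal numpy sketch of that data flow is below; it is an illustration only, and both module bodies are stand-in assumptions (the patent's modules are trained neural networks, approximated here by frame differencing and per-frame pooling), as are all names and dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)

def optical_flow_substitution(frames):
    # Stand-in (assumption) for the patent's optical flow substitution
    # module: temporal frame differences as a cheap motion proxy,
    # pooled to one scalar per frame transition.
    diffs = np.diff(frames, axis=0)        # (T-1, H, W) motion proxy
    return diffs.mean(axis=(1, 2))         # (T-1,) motion feature vector

def spatial_module(frames):
    # Stand-in (assumption) for the 3D convolutional neural module:
    # mean intensity per frame as a trivial spatial feature.
    return frames.mean(axis=(1, 2))        # (T,) spatial feature vector

def classify(frames, weights, bias):
    # Fuse motion and spatial features, then apply a linear
    # classifier with a softmax over action classes.
    feat = np.concatenate([optical_flow_substitution(frames),
                           spatial_module(frames)])
    logits = weights @ feat + bias
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()                 # class probability vector

# Toy input: 8 grayscale frames of 16x16, 3 hypothetical action classes.
T, H, W, n_classes = 8, 16, 16, 3
frames = rng.random((T, H, W))
weights = rng.standard_normal((n_classes, (T - 1) + T))
bias = np.zeros(n_classes)
probs = classify(frames, weights, bias)
print(probs.shape)
```

The point of the sketch is only the topology: both branches consume the raw frames, so no separate optical-flow precomputation step appears in the pipeline, which is the efficiency gain the abstract claims.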

Description

Technical field

[0001] The present disclosure relates to the technical field of machine learning models, and in particular to a video action classification method, apparatus, computer device and storage medium.

Background technique

[0002] With the development of society, more and more users like to use fragmented time to watch or shoot short videos. When a user uploads a short video to the short video platform, relevant personnel on the platform can view the short video and classify the actions of the objects in it according to their subjective judgment, such as dancing, climbing trees, drinking water, etc. The relevant personnel can then add corresponding tags to the short video according to the classification results.

[0003] In the process of realizing the present disclosure, the inventors found at least the following problems:

[0004] Due to the huge number of short videos received by the short video platform, if the actions of objects in each s...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00
CPC: G06V20/41; G06N3/08; G06V20/46; G06V10/82; G06V10/806; G06N3/045; G06F18/253; G06N3/04; G06F18/2134; G06F18/2431
Inventor: 张志伟, 李岩
Owner: BEIJING DAJIA INTERNET INFORMATION TECH CO LTD