
Method, device, computer equipment and storage medium for video action classification

A video action classification technology, applied to computer components, computing, and neural learning methods, that addresses problems such as the low efficiency of classification operations

Active Publication Date: 2020-10-23
BEIJING DAJIA INTERNET INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] Because the short video platform receives a huge number of short videos, manually classifying the actions of the objects in each short video would make the classification operation extremely inefficient




Embodiment Construction

[0075] Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numerals in different drawings refer to the same or similar elements unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the present invention. Rather, they are merely examples of apparatuses and methods consistent with aspects of the invention as recited in the appended claims.

[0076] With the development of society, more and more users like to use fragmented time to watch or shoot short videos. When a user uploads a short video that he or she has shot to the short video platform, the platform needs to classify the actions of the objects in the short video, such as dancing, climbing trees, or drinking water. Corresponding tags can then be added to the short video ac...



Abstract

The disclosure relates to a video action classification method, device, computer equipment, and storage medium, and belongs to the technical field of machine learning models. The method comprises: obtaining a video to be classified and determining a plurality of video frames in it; inputting the plurality of video frames into the optical flow replacement module of a trained, optimized video action classification model to obtain the optical flow feature information corresponding to the frames; inputting the plurality of video frames into the three-dimensional convolutional neural module of the same model to obtain the spatial feature information corresponding to the frames; and determining, based on the optical flow feature information and the spatial feature information, the classification category information corresponding to the video to be classified. With this disclosure, the video frames of the video to be classified can be used directly as input to the optical flow replacement module, which extracts the optical flow feature information directly from the frames, further improving the efficiency of classification processing.
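The pipeline described in the abstract (sample frames, run them through an optical flow replacement branch and a 3D convolutional branch in parallel, then fuse both feature sets to classify) can be illustrated with a minimal Python sketch. The patent does not publish its network internals, so everything below is a hypothetical stand-in: frame differencing approximates the optical flow replacement branch, per-frame averages approximate the 3D-CNN spatial branch, and fusion plus classification is reduced to a threshold on average motion.

```python
# Illustrative sketch only: all names and heuristics below are stand-ins,
# not the model from the patent. Frames are flat lists of pixel intensities.

def motion_features(frames):
    """Optical-flow-replacement stand-in: mean absolute difference
    between each pair of consecutive frames (one value per pair)."""
    return [
        sum(abs(a - b) for a, b in zip(f1, f2)) / len(f1)
        for f1, f2 in zip(frames, frames[1:])
    ]

def spatial_features(frames):
    """3D-CNN stand-in: mean intensity of each frame."""
    return [sum(f) / len(f) for f in frames]

def classify(frames, threshold=0.5):
    """Fuse both feature sets and emit a coarse label:
    'action' if average inter-frame motion exceeds the threshold."""
    motion = motion_features(frames)
    spatial = spatial_features(frames)  # fused only trivially here
    avg_motion = sum(motion) / len(motion)
    label = "action" if avg_motion > threshold else "static"
    return label, avg_motion, spatial

# Example: three 4-pixel frames with visible motion between them.
frames = [[0, 0, 0, 0], [1, 1, 1, 1], [0, 2, 0, 2]]
label, avg_motion, _ = classify(frames)  # label == "action"
```

The point of the real design, as the abstract states, is that the optical flow branch consumes raw frames directly instead of requiring a separate, precomputed optical flow stage, which is where the claimed efficiency gain comes from.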

Description

technical field

[0001] The present disclosure relates to the technical field of machine learning models, and in particular to a video action classification method, device, computer equipment and storage medium.

background technique

[0002] With the development of society, more and more users like to use fragmented time to watch or shoot short videos. When a user uploads a short video that he or she has shot to the short video platform, relevant personnel on the platform can view the short video and classify the actions of the objects in it according to their subjective judgment, such as dancing, climbing trees, or drinking water. Relevant personnel can then add corresponding tags to the short video according to the classification results.

[0003] In the process of realizing the present disclosure, the inventors found at least the following problems:

[0004] Due to the huge number of short videos received by the short video platform, if the actions of objects in each s...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G06K9/00
CPC: G06V20/41; G06N3/08; G06V20/46; G06V10/82; G06V10/806; G06N3/045; G06F18/253; G06N3/04; G06F18/2134; G06F18/2431
Inventor: 张志伟, 李岩
Owner BEIJING DAJIA INTERNET INFORMATION TECH CO LTD