
Time sequence action detection method and device, storage medium and terminal

A temporal action detection technology applied in the computer field that addresses the problems of misaligned time boundary ranges, content deviation, and low computational efficiency, and achieves the effects of reduced information loss and improved GPU utilization and computational efficiency.

Pending Publication Date: 2022-05-03
TERMINUSBEIJING TECH CO LTD

AI Technical Summary

Problems solved by technology

Although a self-attention module can attend to features over different ranges, extreme variation in action scale (duration) makes it highly likely that the network model's receptive field is misaligned with the time boundary range of the corresponding action; in other words, the content within the model's perception range deviates from the content within the action's time boundaries.

To reduce this deviation, current mainstream multi-stage temporal action detection networks first select high-confidence action time boundaries (a start time point and an end time point, usually in seconds) through a nomination network, crop the content within the selected boundary range at the video level, and then feed it to a detection network for boundary refinement and action classification. However, this approach requires a large amount of memory access, data exchange between GPU memory and host memory, and even hard-disk reads and writes, so GPU utilization drops and computational efficiency is low.
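This two-stage nomination-then-detection flow can be summarised with a minimal sketch, assuming a PyTorch-style implementation; the module names, feature dimensions, confidence threshold and segment half-width below are illustrative assumptions, and the crop is applied to pre-extracted features only to keep the example self-contained (the inefficiency described above comes from cropping at the video level).

```python
import torch
import torch.nn as nn

class NominationNet(nn.Module):
    """Scores each time step so high-confidence action boundaries can be nominated."""
    def __init__(self, feat_dim=256):
        super().__init__()
        self.head = nn.Conv1d(feat_dim, 1, kernel_size=1)

    def forward(self, feats):                      # feats: (B, C, T)
        return self.head(feats).sigmoid()          # (B, 1, T) proposal confidence

class DetectionNet(nn.Module):
    """Refines the boundaries and classifies the behavior inside one cropped segment."""
    def __init__(self, feat_dim=256, num_classes=20):
        super().__init__()
        self.conv = nn.Conv1d(feat_dim, feat_dim, kernel_size=3, padding=1)
        self.cls = nn.Linear(feat_dim, num_classes)
        self.reg = nn.Linear(feat_dim, 2)           # refined (start, end)

    def forward(self, segment):                     # segment: (B, C, T_seg)
        x = self.conv(segment).mean(dim=-1)         # pool over the segment's time axis
        return self.cls(x), self.reg(x)

feats = torch.randn(1, 256, 128)                    # features for 128 time steps (illustrative)
conf = NominationNet()(feats)[0, 0]                 # (128,) per-step confidence
detector = DetectionNet()

for t in (conf > 0.5).nonzero().flatten().tolist():
    start, end = max(t - 8, 0), min(t + 8, feats.shape[-1])
    # The costly step the passage criticises: in the real pipeline this crop is done
    # at the video level, so every proposal triggers host/GPU transfers and disk reads.
    segment = feats[:, :, start:end]
    class_scores, refined_boundary = detector(segment)
```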





Embodiment Construction

[0060] The following description and drawings illustrate specific embodiments of the invention sufficiently to enable those skilled in the art to practice them.

[0061] It should be clear that the described embodiments are only some of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative efforts fall within the protection scope of the present invention.

[0062] When the following description refers to the accompanying drawings, the same numerals in different drawings refer to the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present invention; rather, they are merely examples of apparatuses and methods consistent with aspects of the invention as recited in the appended claims.

[0063] In the descriptio...



Abstract

The invention discloses a time sequence action detection method and device, a storage medium and a terminal. The method comprises the following steps: acquiring an action video; inputting the action video into a pre-trained action detection network, wherein the pre-trained action detection network is generated by training on a plurality of local-global fusion features, and the plurality of local-global fusion features are constructed from a local self-attention module and a global self-attention module; and outputting the time sequence action information in the action video. By adopting self-attention modules, the method retains the information of the long-time-sequence features extracted by the convolutional layers from the video and can capture global features over those long-time-sequence features, reducing information loss. At the same time, constructing the plurality of local-global fusion features from the local self-attention module and the global self-attention module improves the accuracy of the fusion. Because the context information of the temporal video can be fused and exploited, the GPU utilization and computational efficiency of training the network on the plurality of local-global fusion features are greatly improved.
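A minimal sketch of constructing a local-global fusion feature from a local (windowed) self-attention branch and a global self-attention branch, assuming PyTorch; the window size, dimensions and concatenation-based fusion are illustrative assumptions, not the patent's actual design.

```python
import torch
import torch.nn as nn

class LocalGlobalFusion(nn.Module):
    """Fuses a windowed (local) and a full-sequence (global) self-attention view."""
    def __init__(self, dim=256, heads=4, window=16):
        super().__init__()
        self.window = window
        self.local_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.global_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.fuse = nn.Linear(2 * dim, dim)

    def forward(self, x):                            # x: (B, T, C), T divisible by window
        B, T, C = x.shape
        # Local branch: self-attention restricted to non-overlapping temporal windows.
        w = x.reshape(B * (T // self.window), self.window, C)
        local, _ = self.local_attn(w, w, w)
        local = local.reshape(B, T, C)
        # Global branch: self-attention over the whole temporal sequence.
        global_view, _ = self.global_attn(x, x, x)
        # Fuse both views into one local-global feature per time step.
        return self.fuse(torch.cat([local, global_view], dim=-1))

feats = torch.randn(2, 128, 256)                     # (batch, time steps, channels)
fused = LocalGlobalFusion()(feats)                   # (2, 128, 256) local-global features
```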

Description

Technical field
[0001] The present invention relates to the field of computer technology, and in particular to a time sequence action detection method, device, storage medium and terminal.
Background technique
[0002] With the massive increase in online and personal media archives, people are generating, storing and consuming vast amounts of video. This trend encourages the development of efficient algorithms that intelligently parse video data. One of the fundamental challenges for the success of these efforts is the temporal and spatial localization of actions in videos, i.e., temporal action detection. The current mainstream temporal action detection network consists mainly of a CNN network layer and a fully connected layer: the CNN network layer is chiefly responsible for feature extraction, and the fully connected layer is chiefly responsible for behavior classification. In general, the input shape size and output shape size of the CNN network layer and the ...
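The baseline described here, a CNN layer for feature extraction followed by a fully connected layer for behavior classification, can be sketched as follows, assuming PyTorch; the channel sizes, temporal length and number of classes are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Minimal sketch of the mainstream baseline: a 1-D CNN stack extracts temporal
# features and a fully connected layer classifies the behavior. All sizes are
# illustrative assumptions.
baseline = nn.Sequential(
    nn.Conv1d(2048, 512, kernel_size=3, padding=1),   # temporal conv over per-frame features
    nn.ReLU(),
    nn.Conv1d(512, 512, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),                           # pool over the time axis
    nn.Flatten(),
    nn.Linear(512, 20),                                # fully connected behavior classifier
)

clip_feats = torch.randn(1, 2048, 64)                  # (batch, channels, time steps)
scores = baseline(clip_feats)                          # (1, 20) class scores
```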

Claims


Application Information

IPC(8): G06V40/10; G06V40/20; G06V10/44; G06V10/766; G06V10/764; G06V10/82; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/047; G06N3/048; G06N3/045; G06F18/241; G06F18/2415; G06F18/25
Inventor: 刘斌, 张睿, 张先福, 蒙学文
Owner: TERMINUSBEIJING TECH CO LTD