
Time sequence behavior capture box generation method and device based on self-attention network

A self-attention network technology, applied in image data processing, instrumentation, and computing, which can solve the problems of time-consuming recursive operations and a limited range of contextual information, and achieve the effect of improving generation accuracy.

Active Publication Date: 2019-06-25
TENCENT TECH (SHENZHEN) CO LTD

AI Technical Summary

Problems solved by technology

Among the prior-art approaches, the recurrent neural network models sequence context information through recursive operations, but recursion is very time-consuming. The convolutional neural network can be parallelized for acceleration, yet the range of context information captured by stacking multiple convolutional layers remains limited.
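To make that contrast concrete, the following minimal sketch (an illustrative example, not the patent's implementation; all names and shapes are assumptions) shows how a single self-attention layer mixes context across an entire sequence in one set of parallel matrix products, avoiding both the step-by-step recursion of an RNN and the bounded receptive field of stacked convolutions.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention over a sequence.

    x: (T, d) feature matrix for T time steps. Each output step is a
    weighted sum over ALL T input steps, computed with parallel matrix
    products rather than a T-step recursion.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v          # (T, d) projections
    scores = q @ k.T / np.sqrt(k.shape[-1])      # (T, T) pairwise affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over time
    return weights @ v                           # (T, d) context-mixed features

# Toy usage: 100 time steps of 64-dimensional video features (random stand-ins).
rng = np.random.default_rng(0)
T, d = 100, 64
x = rng.standard_normal((T, d))
w_q, w_k, w_v = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (100, 64): every step sees global context in one pass
```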




Detailed Description of Embodiments

[0028] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0029] It should be noted that the terms "first" and "second" in the description, the claims, and the above drawings of the present invention are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It is to be understood that the data so used are interchangeable under appropriate circumstances, such that the embodiments of the invention described herein can be practiced in sequences other than those illustrated or described herein.



Abstract

The invention discloses a method and device for generating time sequence behavior capture frames based on a self-attention network. The method comprises: obtaining a target video; obtaining an input feature matrix from the target video and inputting it into a first self-attention model to obtain an action probability sequence, a start point probability sequence, and an end point probability sequence, wherein the first self-attention model is formed by connecting a plurality of groups in series, each group comprises at least one self-attention unit, and the target feature matrix corresponding to each group is offset in time sequence; generating a candidate capture frame set according to the action probability sequence, the start point probability sequence, and the end point probability sequence; and extracting relative position information among the candidate capture frames and inputting it into a second self-attention model to obtain a target capture frame set. By arranging two self-attention models, the method improves the generation accuracy of the target capture frame set.
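As a reading aid, the sketch below turns the abstract's pipeline into runnable toy code. Everything here is a simplified assumption for illustration (the thresholding rule, the random probability sequences, and the helper name pair_boundaries are hypothetical); the patent's actual models and candidate-generation logic are not reproduced.

```python
import numpy as np

def pair_boundaries(start_prob, end_prob, thresh=0.9, max_len=50):
    """Pair high-probability start/end steps into candidate capture frames.

    Simplified stand-in for the candidate-generation step: any start step s
    and end step e > s whose probabilities both clear a threshold (and whose
    span is not too long) become a candidate frame (s, e).
    """
    starts = np.flatnonzero(start_prob >= thresh)
    ends = np.flatnonzero(end_prob >= thresh)
    return [(int(s), int(e)) for s in starts for e in ends if s < e <= s + max_len]

# Random sequences standing in for the first self-attention model's outputs
# (start point and end point probabilities per time step).
rng = np.random.default_rng(1)
T = 100
start_prob, end_prob = rng.random(T), rng.random(T)
candidates = pair_boundaries(start_prob, end_prob)
print(len(candidates), candidates[:3])
# A second model would then rescore these candidates using their relative
# positions to produce the final target capture frame set.
```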

Description

Technical Field

[0001] The present invention relates to the field of machine learning, and in particular to a method and device for generating time sequence behavior capture frames based on a self-attention network.

Background

[0002] Videos in real life often contain a series of continuous, complex actions and related events. Driven by the need to study temporal behavior in videos, the task of generating time sequence behavior capture frames has become a research hotspot. How to generate high-quality capture frames that cover the real action instances in a video with both high recall and a high intersection-over-union (IoU) remains a research difficulty, however. In the prior art, a time sequence behavior learning task can be constructed to obtain such capture frames; specifically, a recurrent neural network (RNN) and a convolutional neural network (CNN) ...
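The background evaluates capture frames by recall and intersection-over-union (IoU). For one-dimensional temporal segments, IoU reduces to overlap length divided by union length; a small sketch (hypothetical helper, not from the patent):

```python
def temporal_iou(a, b):
    """IoU of two temporal segments given as (start, end) pairs.

    For capture frames on a video timeline, this is the 1-D analogue of
    the 2-D bounding-box IoU used in object detection.
    """
    inter = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    union = (a[1] - a[0]) + (b[1] - b[0]) - inter
    return inter / union if union > 0 else 0.0

# Example: a predicted capture frame vs. a ground-truth action instance.
print(temporal_iou((2.0, 8.0), (4.0, 10.0)))  # overlap 4 / union 8 = 0.5
```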


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/20
Inventors: 郭大山 (Guo Dashan), 姜文浩 (Jiang Wenhao), 刘威 (Liu Wei)
Owner: TENCENT TECH (SHENZHEN) CO LTD