Video frame insertion method based on spatio-temporal joint attention

A technology involving attention and video frames, applied in the fields of video processing, slow motion generation, and video post-processing. It addresses the problems of motion-edge artifacts and inaccurate frame interpolation results, achieving the effects of improved accuracy, faster video frame interpolation, and a low network parameter count.

Active Publication Date: 2022-06-07
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

However, this algorithm does not explicitly capture the temporal dependencies between input frames, resulting in serious artifacts at the motion edges of the generated frames and inaccurate frame interpolation results.


Examples


Embodiment Construction

[0029] The present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments:

[0030] Referring to Figure 1, the present invention comprises the following steps:

[0031] Step 1) Obtain the training sample set and the test sample set:

[0032] Select V original videos, each containing L image frames, and crop every frame through a cropping window of size H×W to obtain the preprocessed video frame sequence corresponding to each original video, where H=448 and W=256 denote the length and width of the cropping window, respectively. Mark the odd-numbered and the even-numbered image frames in each preprocessed video frame sequence, take the R marked video frame sequences V_1 = {V_1^r | 1 ≤ r ≤ R} as the training sample set, and take the remaining S marked video frame sequences as the test sample set, where L=7, V=7564, R=3782,...
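A minimal sketch of how this preprocessing step could be scripted, assuming the original videos are already decoded into per-frame PNG files and using a centre crop; the directory layout, crop position, and the helper crop_and_label are illustrative assumptions, not details given in the patent:

```python
# Illustrative sketch of Step 1 (assumed file layout and centre crop; the patent
# only fixes the crop window H=448, W=256 and the L=7 frames per clip).
from pathlib import Path
from PIL import Image

H, W, L = 448, 256, 7   # crop window sides and frames per clip, as in the patent


def crop_and_label(clip_dir: Path, out_dir: Path) -> None:
    """Crop every frame of one clip to the H x W window and tag it as odd/even."""
    frames = sorted(clip_dir.glob("*.png"))[:L]
    out_dir.mkdir(parents=True, exist_ok=True)
    for idx, frame_path in enumerate(frames, start=1):
        img = Image.open(frame_path)
        left = max(0, (img.width - H) // 2)      # centre crop (assumption)
        top = max(0, (img.height - W) // 2)
        cropped = img.crop((left, top, left + H, top + W))
        tag = "odd" if idx % 2 == 1 else "even"  # mark odd/even frames per Step 1
        cropped.save(out_dir / f"{idx:02d}_{tag}.png")


# Split the V clips in half: the first R = V // 2 clips form the training set
# (R = 3782 when V = 7564), the remaining S clips form the test set.
clips = sorted(Path("videos").iterdir())
for i, clip in enumerate(clips):
    split = "train" if i < len(clips) // 2 else "test"
    crop_and_label(clip, Path("dataset") / split / clip.name)
```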


Abstract

The invention provides a video frame insertion method based on spatio-temporal joint attention. The method comprises the following implementation steps: (1) acquiring a training data set and a test data set; (2) constructing a video frame insertion network based on spatio-temporal joint attention; (3) iteratively training the video frame insertion network model; and (4) obtaining the video frame insertion result. The video frame insertion model based on spatio-temporal joint attention captures the spatio-temporal relationship between input frames with a spatio-temporal attention mechanism and models complex motion, so that high-quality video frame insertion is achieved. Compared with most existing networks, the algorithm does not use additional optical flow input, which avoids the extra errors introduced by optical flow estimation, keeps the network parameter count low, and has practical application value.
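The page does not reproduce the network architecture, so the block below is only a minimal sketch of the general idea of spatio-temporal joint attention, assuming a transformer-style self-attention (torch.nn.MultiheadAttention) over tokens drawn jointly from the spatial and temporal dimensions; the class name, channel count, and head count are illustrative, not the patented design:

```python
# Minimal sketch of a spatio-temporal joint attention block (assumed design):
# features of the two input frames are flattened into one token sequence over
# space AND time, so self-attention mixes information across both dimensions.
import torch
import torch.nn as nn


class SpatioTemporalJointAttention(nn.Module):
    def __init__(self, channels: int = 64, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, T, C, H, W) features of the T=2 neighbouring input frames
        b, t, c, h, w = feats.shape
        tokens = feats.permute(0, 1, 3, 4, 2).reshape(b, t * h * w, c)  # (B, T*H*W, C)
        attended, _ = self.attn(tokens, tokens, tokens)                  # joint space-time attention
        tokens = self.norm(tokens + attended)                            # residual + layer norm
        return tokens.reshape(b, t, h, w, c).permute(0, 1, 4, 2, 3)      # back to (B, T, C, H, W)


if __name__ == "__main__":
    # Two frames' features at a coarse resolution, purely for illustration.
    dummy = torch.randn(1, 2, 64, 16, 28)
    out = SpatioTemporalJointAttention()(dummy)
    print(out.shape)  # torch.Size([1, 2, 64, 16, 28])
```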

Description

Technical Field

[0001] The invention belongs to the technical field of video processing and relates to a video frame insertion method, in particular to a video frame insertion method based on spatio-temporal joint attention, which can be used in fields such as slow motion generation and video post-processing.

Background Technique

[0002] Low temporal resolution can cause image aliasing and produce artifacts, which degrade video quality, so temporal resolution is an important factor affecting video quality. A video frame insertion method inserts one or more intermediate frames between consecutive image frames to increase the temporal resolution and improve the video quality.

[0003] A video frame interpolation method usually consists of two parts: motion estimation and pixel synthesis. Motion estimation refers to predicting the position of the pixel corresponding to the intermediate frame by calculating the motion of the pixel between the two frames befo...
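To make the contrast with the conventional pipeline concrete, below is a small sketch of the classical motion-estimation-plus-pixel-synthesis approach described in the background, not the patented method (which avoids explicit optical flow). The flow field, blending weights, and linear-motion approximation are assumptions for illustration:

```python
# Sketch of the conventional two-stage pipeline (motion estimation + pixel
# synthesis): a given optical flow field is used to backward-warp the two
# input frames toward the temporal midpoint, then the warps are blended.
import torch
import torch.nn.functional as F


def warp(frame: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Backward-warp `frame` (B, C, H, W) by `flow` (B, 2, H, W), in pixels."""
    b, _, h, w = frame.shape
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    grid = torch.stack((xs, ys), dim=0).float().to(frame)    # (2, H, W) base coords
    coords = grid.unsqueeze(0) + flow                         # sampling locations
    # Normalise to [-1, 1] as required by grid_sample.
    coords_x = 2.0 * coords[:, 0] / (w - 1) - 1.0
    coords_y = 2.0 * coords[:, 1] / (h - 1) - 1.0
    norm_grid = torch.stack((coords_x, coords_y), dim=-1)     # (B, H, W, 2)
    return F.grid_sample(frame, norm_grid, align_corners=True)


# Pixel synthesis for the midpoint under a linear-motion assumption:
# flow(t->0) ~ -0.5 * flow(0->1) and flow(t->1) ~ 0.5 * flow(0->1).
# Any error in the estimated flow propagates directly into the result,
# which is the weakness the flow-free patented design targets.
frame0 = torch.rand(1, 3, 256, 448)
frame1 = torch.rand(1, 3, 256, 448)
flow_0to1 = torch.zeros(1, 2, 256, 448)                       # placeholder flow estimate
mid = 0.5 * warp(frame0, -0.5 * flow_0to1) + 0.5 * warp(frame1, 0.5 * flow_0to1)
print(mid.shape)  # torch.Size([1, 3, 256, 448])
```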


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): H04N7/01G06T7/269G06N3/04G06N3/08
CPCH04N7/014G06T7/269G06N3/08G06T2207/10016G06N3/045
Inventors: 路文, 张弘毅, 冯姣姣, 张立泽, 胡健
Owner: XIDIAN UNIV