A Joint Optimization Method for Spatiotemporal Consistency and Feature Center EMD Adaptive Video Stabilization

An adaptive video stabilization technology, applied to colour-television and television-system components, etc.; it reduces the cropping area, stabilizes shaky video, and improves stability and adaptability

Active Publication Date: 2021-08-27
BEIHANG UNIV
Cites: 3 · Cited by: 0
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] The technical problem solved by the present invention is to overcome the stability and robustness problems of existing video stabilization methods by providing a joint optimization method of spatiotemporal consistency and feature-center EMD adaptive video stabilization. The method adaptively handles video stabilization, image saliency protection, parallax reduction, adaptive smoothing, crop-area reduction, and video completion, improving the stability, generality, accuracy, and adaptability of video enhancement processing and improving video integrity.


Examples


Embodiment Construction

[0055] The present invention will be described in detail below in conjunction with the accompanying drawings and embodiments.

[0056] As shown in Figure 1, the steps of the present invention are:

[0057] (1) Extract image features with the SIFT method, perform feature matching to obtain saliency vectors, and use these as the benchmark for spatially consistent deformation.
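
As an illustration of the matching in step (1): SIFT descriptors from adjacent frames are typically paired by brute-force nearest-neighbour search filtered with Lowe's ratio test. A minimal sketch (the toy 2-D descriptors and the 0.75 ratio threshold are illustrative assumptions, not values specified by the patent):

```python
def match_features(desc_a, desc_b, ratio=0.75):
    """Brute-force nearest-neighbour matching with Lowe's ratio test.

    desc_a, desc_b: lists of equal-length descriptor vectors (real SIFT
    descriptors are 128-dimensional; short vectors are used here for clarity).
    Returns a list of (index_in_a, index_in_b) pairs that pass the test.
    """
    def dist2(u, v):
        # Squared Euclidean distance between two descriptors.
        return sum((x - y) ** 2 for x, y in zip(u, v))

    matches = []
    for i, d in enumerate(desc_a):
        # Rank all candidates in desc_b by distance to descriptor d.
        scored = sorted((dist2(d, e), j) for j, e in enumerate(desc_b))
        # Accept only if the best match is clearly better than the second
        # best (ratio test on squared distances, hence ratio**2).
        if len(scored) >= 2 and scored[0][0] < (ratio ** 2) * scored[1][0]:
            matches.append((i, scored[0][1]))
    return matches
```

Ambiguous features (whose two nearest candidates are similar) are discarded, which keeps the match set reliable for the later deformation step.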

[0058] (2) Starting from the viewpoint position, deform each image frame obtained in step (1), reacquire the feature set, construct a SIFT-based spatial structure matrix, extract rotation, translation, and scaling information to build the original motion signal, and obtain the new motion signal according to the adaptive intrinsic mode function optimization algorithm.
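
The rotation, translation, and scaling named in step (2) can be recovered from matched feature pairs by a least-squares 2-D similarity fit. A sketch using complex arithmetic (this closed form is a standard estimation technique, not necessarily the patent's exact estimator):

```python
import cmath
import math

def fit_similarity(src, dst):
    """Least-squares 2-D similarity transform mapping src -> dst.

    Points (x, y) are treated as complex numbers p, q, and we solve
    q ≈ a*p + b in the least-squares sense; then |a| is the scale,
    arg(a) the rotation, and b the translation.
    Returns (scale, rotation_degrees, (tx, ty)).
    """
    p = [complex(x, y) for x, y in src]
    q = [complex(x, y) for x, y in dst]
    mp = sum(p) / len(p)          # centroid of source points
    mq = sum(q) / len(q)          # centroid of destination points
    pc = [z - mp for z in p]      # centered coordinates
    qc = [z - mq for z in q]
    # Closed-form least-squares solution for the complex gain a.
    a = sum(zp.conjugate() * zq for zp, zq in zip(pc, qc)) \
        / sum(abs(z) ** 2 for z in pc)
    b = mq - a * mp               # translation follows from the centroids
    return abs(a), math.degrees(cmath.phase(a)), (b.real, b.imag)
```

Applying the fit frame-to-frame yields per-frame scale, rotation, and translation sequences, which can serve as the raw motion signals that the EMD stage then smooths.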

[0059] (3) The adaptive motion signal calculated in step (2) is used as the new input signal. According to the feature-center algorithm, the motion trend of the original signal is preserved as far as possible while the jitt...
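
Step (3) builds on EMD, whose core operation is "sifting": subtracting the mean of the upper and lower extrema envelopes to split a motion signal into high-frequency jitter (an intrinsic-mode-function candidate) and a slowly varying residue that preserves the motion trend. A simplified one-pass sketch with linear envelopes (real EMD uses cubic-spline envelopes and iterates until stopping criteria are met; this is illustrative only):

```python
def sift_once(signal):
    """One simplified EMD sifting pass over a 1-D motion signal.

    Returns (imf, residue): imf is the high-frequency jitter estimate,
    residue is the local envelope mean, i.e. the smoothed camera path.
    """
    n = len(signal)
    # Interior local maxima and minima of the signal.
    maxima = [i for i in range(1, n - 1)
              if signal[i] >= signal[i - 1] and signal[i] >= signal[i + 1]]
    minima = [i for i in range(1, n - 1)
              if signal[i] <= signal[i - 1] and signal[i] <= signal[i + 1]]

    def envelope(idx):
        # Piecewise-linear envelope through the extrema, pinned at the ends.
        pts = [0] + idx + [n - 1]
        vals = [signal[i] for i in pts]
        env = []
        for x in range(n):
            k = max(j for j in range(len(pts)) if pts[j] <= x)
            if k == len(pts) - 1:
                env.append(vals[-1])
            else:
                t = (x - pts[k]) / (pts[k + 1] - pts[k])
                env.append(vals[k] * (1 - t) + vals[k + 1] * t)
        return env

    upper, lower = envelope(maxima), envelope(minima)
    residue = [(u + l) / 2 for u, l in zip(upper, lower)]
    imf = [s - m for s, m in zip(signal, residue)]
    return imf, residue
```

On a linear camera path contaminated with alternating ±1 jitter, the residue recovers the path and the IMF isolates the jitter, which is the behaviour the adaptive smoothing stage relies on.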



Abstract

The invention provides a joint optimization method of spatiotemporal consistency and feature-center EMD adaptive video stabilization. On the basis of decomposing noise signals with the EMD method, and targeting video anti-shake, an estimation method based on a spatial-structure consistency matrix is applied to shaky video. Saliency protection, parallax elimination, adaptive smoothing, crop-area reduction, video completion, and other techniques improve the stability, versatility, accuracy, and adaptability of video enhancement processing and improve the integrity of the video.

Description

Technical field

[0001] The invention relates to a joint optimization method of spatiotemporal consistency and feature-center EMD (Empirical Mode Decomposition) adaptive video stabilization, belonging to the technical field of computer vision enhancement.

Background technique

[0002] Handheld devices such as mobile phones, camcorders, tablets, and general-purpose cameras have become popular with amateurs, but because these devices offer only simplistic stabilization, the video they capture is often shaky and unsightly, and is very uncomfortable to watch. Video stabilization technology is designed to remove the jitter and vibration between frames that are visible in shaky videos. It is one of the most active research topics in the field of computer vision and can be applied to many high-level video enhancement applications, such as human observation, video recognition, video detection, video tracking, video compression, ...

Claims


Application Information

Patent Timeline: no application data
Patent Type & Authority: Patent (China)
IPC(8): G06T5/00, G06T3/40, G06K9/46, H04N5/232
CPC: G06T3/4007, G06T5/005, G06V10/462, H04N23/682
Inventors: 郝爱民, 李晓, 李帅, 秦洪
Owner BEIHANG UNIV