
Multi-sensor video fusion and de-noising method and device

A multi-sensor video fusion technology in the field of image processing, addressing problems such as contrast reduction, interference, and the inability to ensure the temporal stability and consistency of image sequences.

Inactive Publication Date: 2016-12-07
BEIJING UNIV OF POSTS & TELECOMM


Problems solved by technology

The external environment of a moving target is complex and changeable, so a sensor capturing images is often affected by interference factors such as noise. This interference information is introduced during acquisition and degrades the visual quality of the captured images, so noise and other interference must be removed at the same time as fusion is performed.
[0003] The traditional fusion method performs a direct weighted fusion of the original images, which easily reduces contrast and introduces interference.
Fusion methods implemented in the transform domain, such as multi-scale decomposition, high-order singular value decomposition, and Markov random fields, can ensure the quality of single frames through frame-by-frame fusion, but they struggle to maintain the temporal consistency and stability of the image sequence, and their limited ability to represent motion information prevents satisfactory results.
[0004] Most existing image fusion methods focus only on combining useful pixels from the original images and rarely consider the handling of noise and other interference information. Without further processing to remove such interference, noise is easily carried into the fused image together with the useful pixels. These methods therefore cannot ensure the temporal stability and consistency of image sequences while improving image quality.
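To illustrate the contrast-loss problem described above, a minimal numpy sketch follows; the two sensor frames and the equal weights are hypothetical stand-ins for co-registered visible and infrared inputs, not data from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical co-registered sensor frames with different detail levels.
frame_a = rng.normal(loc=128, scale=40, size=(64, 64))  # higher-contrast frame
frame_b = rng.normal(loc=128, scale=10, size=(64, 64))  # lower-contrast frame

# Direct weighted fusion of the raw pixels, as in the traditional method.
fused = 0.5 * frame_a + 0.5 * frame_b

# Contrast (standard deviation) of the fused frame drops below the sharper
# input, because uncorrelated detail partially cancels under averaging.
print(frame_a.std(), frame_b.std(), fused.std())
```

This is exactly the contrast reduction the background section attributes to direct weighted fusion, and it motivates fusing in a transform domain instead.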




Embodiment Construction

[0091] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with specific embodiments and with reference to the accompanying drawings.

[0092] It should be noted that the expressions "first" and "second" in the embodiments of the present invention are used only to distinguish two entities or parameters that share the same name but are not identical. They are merely a convenience of expression and should not be construed as limiting the embodiments; this will not be repeated in subsequent embodiments.

[0093] Referring to figure 1, which is a schematic flow chart of the multi-sensor video fusion and noise-reduction method in the first embodiment of the present invention, the method includes:

[0094] Step 101, converti...



Abstract

The invention discloses a multi-sensor video fusion and de-noising method and device. The method comprises the steps of converting the input video frames to the frequency domain and generating low-frequency and high-frequency sub-band coefficients; fusing the low-frequency and high-frequency sub-band coefficients separately; and performing an inverse transformation on the fused coefficients to obtain the fused video. The method and device thereby achieve multi-sensor, cross-scale video fusion and de-noising through video frame decomposition based on the three-dimensional shearlet transform, a high-frequency coefficient fusion strategy based on the 3D PCNN (Pulse Coupled Neural Network), and a low-frequency coefficient fusion strategy based on a saliency-driven 3D PCNN.
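The abstract's pipeline (transform to the frequency domain, fuse low- and high-frequency sub-bands separately, then invert) can be sketched with a simple 2D FFT split standing in for the 3D shearlet decomposition. The radius threshold, the averaging rule for low frequencies, and the max-magnitude rule for high frequencies are illustrative assumptions, not the patented PCNN fusion strategies:

```python
import numpy as np

def freq_split(frame, radius=8):
    """Split a frame into low/high-frequency FFT coefficients
    (a stand-in for the shearlet sub-band decomposition)."""
    F = np.fft.fftshift(np.fft.fft2(frame))
    h, w = frame.shape
    yy, xx = np.ogrid[:h, :w]
    low_mask = (yy - h // 2) ** 2 + (xx - w // 2) ** 2 <= radius ** 2
    return F * low_mask, F * ~low_mask

def fuse(frame_a, frame_b, radius=8):
    low_a, high_a = freq_split(frame_a, radius)
    low_b, high_b = freq_split(frame_b, radius)
    # Low frequencies: plain average here (the patent uses a
    # saliency-driven 3D PCNN rule instead).
    low = 0.5 * (low_a + low_b)
    # High frequencies: keep the larger-magnitude coefficient here
    # (the patent uses a 3D PCNN firing-map rule instead).
    high = np.where(np.abs(high_a) >= np.abs(high_b), high_a, high_b)
    # Inverse transform back to the spatial domain.
    return np.real(np.fft.ifft2(np.fft.ifftshift(low + high)))

rng = np.random.default_rng(1)
a = rng.normal(128, 30, (64, 64))
b = rng.normal(128, 30, (64, 64))
fused = fuse(a, b)
print(fused.shape)
```

Fusing per-sub-band like this preserves detail that direct pixel averaging would attenuate; a sanity check is that fusing a frame with itself reconstructs it.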

Description

Technical field

[0001] The invention relates to the technical field of image processing, and in particular to a method and device for multi-sensor video fusion and noise reduction.

Background technique

[0002] In order to obtain complete information about a scene, the same scene needs to be captured by multiple different sensors at the same time. To use these video contents fully and efficiently, the multiple video streams need to be combined into one video sequence, which is achieved with image sequence fusion methods. Image sequence fusion synthesizes multiple image sequences from different sensors into a single sequence that contains all the important information of the originals, eliminating redundancy and improving the usability of the information. The external environment where the moving target is located is complex and changeable. During the process of capturing i...

Claims


Application Information

IPC(8): H04N5/262; G06T5/00
CPC: H04N5/2622; G06T2207/10016; G06T2207/20221; G06T5/70
Inventors: 杜军平, 徐亮
Owner BEIJING UNIV OF POSTS & TELECOMM