
Remote sensing image fusion method based on contourlet transform and guided filter

A remote sensing image fusion technology based on the contourlet transform and guided filtering, applied in the field of image processing, which solves the problems of reduced image contrast and unclear expression of image edge characteristics, and achieves high speed, high efficiency, and clear edge detail information.

Active Publication Date: 2016-07-13
XIDIAN UNIV

Problems solved by technology

[0005] The invention discloses a remote sensing image fusion method based on the contourlet transform and guided filtering, which mainly solves the problems of reduced image contrast and unclear expression of image edge characteristics caused by existing image fusion methods.

Examples


Embodiment 1

[0024] The present invention proposes a remote sensing image fusion method based on the contourlet transform and guided filtering, see Figure 1, which includes the following steps:

[0025] (1) The two images to be fused, a multispectral image and a panchromatic image, are obtained by photographing the same target at the same time with equipment having different imaging characteristics, as shown in Figure 3, where the two remote sensing images represent surface information: Figure 3(a) is the multispectral image and Figure 3(b) is the panchromatic image. The contourlet transform is applied to each of the two source images; in the present invention the number of decomposition layers is 3. The contourlet decomposition yields the high-frequency coefficients and low-frequency coefficients corresponding to each source image.
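As a rough illustration of step (1), the sketch below splits each source image into one low-frequency band and a set of high-frequency detail bands. Since no particular Python contourlet implementation is assumed here, PyWavelets' wavedec2 stands in for the 3-level contourlet transform; the decompose helper and the random input arrays are hypothetical placeholders.

```python
import numpy as np
import pywt  # PyWavelets, used only as a stand-in for a contourlet library


def decompose(image, levels=3, wavelet="db2"):
    """Split an image into low-frequency and high-frequency coefficients.

    The patent uses a 3-level contourlet transform; a 3-level wavelet
    decomposition is substituted here purely for illustration.
    """
    coeffs = pywt.wavedec2(image, wavelet=wavelet, level=levels)
    low = coeffs[0]    # coarsest approximation band (low-frequency)
    high = coeffs[1:]  # per-level detail bands (high-frequency)
    return low, high


# Hypothetical co-registered inputs standing in for the two source images.
ms = np.random.rand(256, 256)   # multispectral band
pan = np.random.rand(256, 256)  # panchromatic image

low_ms, high_ms = decompose(ms)
low_pan, high_pan = decompose(pan)
```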

[0026] (2) The high-frequency coefficients of the two source images are fused using a weig...

Embodiment 2

[0031] The remote sensing image fusion method based on the contourlet transform and guided filtering is the same as in Embodiment 1; see Figure 2. The weighted fusion method based on guided filtering described in step (2), which may also be referred to as the guided-filtering-based weighted fusion rule, comprises the following steps:

[0032] (2a) Let H1 and H2 denote the sets of high-frequency coefficients of the multispectral image and the panchromatic image, respectively, in a given scale and direction. Comparing H1 and H2 pixel by pixel yields the weight maps W1 and W2:

[0033]

[0034] (2b) The weight maps obtained by direct comparison contain noise and their edges are not aligned with the source images, so guided filtering is applied to the weight maps W1 and W2, using the two sets of high-frequency coefficients H1 and H2 respectively as the guide images. When filtering with the guided filter, the parameter s...
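A minimal sketch of the guided-filtering-based weighted fusion rule of steps (2a)-(2b) is given below. The binary weight maps come from a per-pixel magnitude comparison (one common reading of step (2a); the patent's exact comparison formula is not reproduced above), the guided filter is a plain box-filter implementation in the style of He et al., and the radius and regularization values are placeholders rather than the patent's parameters.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def guided_filter(guide, src, radius=4, eps=1e-3):
    """Minimal guided filter: edge-preserving smoothing of `src` steered by
    `guide`, with box filtering done by a uniform (mean) filter."""
    size = 2 * radius + 1
    mean_i = uniform_filter(guide, size)
    mean_p = uniform_filter(src, size)
    corr_i = uniform_filter(guide * guide, size)
    corr_ip = uniform_filter(guide * src, size)

    var_i = corr_i - mean_i * mean_i
    cov_ip = corr_ip - mean_i * mean_p

    a = cov_ip / (var_i + eps)
    b = mean_p - a * mean_i
    return uniform_filter(a, size) * guide + uniform_filter(b, size)


def fuse_high_frequency(h1, h2, radius=4, eps=1e-3):
    """Fuse two high-frequency bands: binary weight maps from a per-pixel
    magnitude comparison, guided filtering of each map (guided by its own
    band), then a normalized weighted sum."""
    w1 = (np.abs(h1) >= np.abs(h2)).astype(float)   # step (2a), assumed form
    w2 = 1.0 - w1

    w1 = guided_filter(h1, w1, radius, eps)         # step (2b)
    w2 = guided_filter(h2, w2, radius, eps)

    total = w1 + w2 + 1e-12                         # guard against division by zero
    return (w1 * h1 + w2 * h2) / total
```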

Embodiment 3

[0044] The remote sensing image fusion method based on the contourlet transform and guided filtering is the same as in Embodiments 1-2; see Figure 2. The low-frequency fusion method based on maximum region energy described in step (3), which may also be called the region-energy-maximum low-frequency fusion rule, comprises the following steps:

[0045] (3a) Compute the region energy of each of the two sets of low-frequency coefficients. In this example a window with radius l = 4 is selected; the choice of window size depends on the specific image content, has little influence on the experimental results, and may be chosen freely in the experiments. Let C1 denote the low-frequency coefficients corresponding to the multispectral image and C2 the low-frequency coefficients corresponding to the panchromatic image; the low-frequency coefficients C of the multispect...
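The sketch below illustrates one common form of the region-energy-maximum rule of step (3): each coefficient's energy is summed over a window of radius l = 4, and at every pixel the low-frequency coefficient with the larger local energy is kept. Because the patent's energy expression and selection formula are truncated above, the squared-coefficient energy and the hard selection used here are assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def region_energy(coeffs, radius=4):
    """Local region energy: windowed sum of squared coefficients.

    uniform_filter returns a windowed mean, so the result is scaled by the
    window area to obtain the sum over the (2*radius+1)**2 neighbourhood.
    """
    size = 2 * radius + 1
    return uniform_filter(coeffs * coeffs, size) * (size * size)


def fuse_low_frequency(c1, c2, radius=4):
    """Region-energy-maximum rule: at each pixel keep the low-frequency
    coefficient whose local region energy is larger."""
    e1 = region_energy(c1, radius)
    e2 = region_energy(c2, radius)
    return np.where(e1 >= e2, c1, c2)
```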


Abstract

The invention discloses a remote sensing image fusion method based on the contourlet transform and guided filtering, which mainly solves the problems of reduced image contrast and unclear expression of image edge characteristics caused by existing image fusion methods. The method comprises the following steps: the same target is photographed to obtain a multispectral image and a panchromatic image to be fused, the contourlet transform is applied to each, and the corresponding high-frequency and low-frequency coefficients are obtained; the high-frequency coefficients of the two source images are fused with a weighted fusion method based on guided filtering to obtain the high-frequency coefficients of the fused image; the low-frequency coefficients of the two source images are fused with a region-energy-maximum method to obtain the low-frequency coefficients of the fused image; and the inverse contourlet transform is applied to the fused high-frequency and low-frequency coefficients to obtain the fused image of the target. By combining the contourlet transform with the guided filter, the method achieves a clear fusion effect with high values of the image evaluation metrics, and can be applied to image analysis and processing, surveying and mapping, geology, and other fields.
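Tying the steps of the abstract together, a minimal end-to-end sketch might look as follows. It reuses the hypothetical decompose, fuse_high_frequency, and fuse_low_frequency helpers from the sketches in the Examples section, and again substitutes PyWavelets for the contourlet transform and its inverse.

```python
import pywt  # stand-in for the contourlet transform and its inverse

# Assumes decompose, fuse_high_frequency and fuse_low_frequency as defined
# in the sketches under Embodiments 1-3 above.


def fuse_images(ms, pan, levels=3, wavelet="db2"):
    """Decompose both images, fuse high- and low-frequency bands with the
    rules sketched above, then reconstruct the fused image."""
    low_ms, high_ms = decompose(ms, levels, wavelet)
    low_pan, high_pan = decompose(pan, levels, wavelet)

    # Guided-filter weighted fusion of each directional high-frequency band.
    fused_high = [
        tuple(fuse_high_frequency(b_ms, b_pan)
              for b_ms, b_pan in zip(level_ms, level_pan))
        for level_ms, level_pan in zip(high_ms, high_pan)
    ]

    # Region-energy-maximum fusion of the low-frequency bands.
    fused_low = fuse_low_frequency(low_ms, low_pan)

    # Inverse transform reconstructs the fused image.
    return pywt.waverec2([fused_low] + fused_high, wavelet=wavelet)
```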

Description

Technical Field

[0001] The invention belongs to the technical field of image processing and mainly relates to image fusion technology, specifically a remote sensing image fusion method based on the contourlet transform and guided filtering, which can be used for image analysis and processing and has a wide range of applications in surveying and mapping, geology, agriculture, and meteorology.

Background Technique

[0002] Image fusion is the process of combining two or more images taken of the same target into a single image. The images to be fused are usually images taken by devices with different imaging characteristics at the same time, or images taken by the same imaging device at different points in time, so these images to be fused have different characteristics. By fusing these images together through a certain algorithm, the resulting composite image can provide richer content information, which is convenient for people to further study the target and an...

Claims


Application Information

IPC(8): G06T5/00; G06T5/50
CPC: G06T5/50; G06T5/70
Inventor: 那彦 (Na Yan), 任梦乔 (Ren Mengqiao), 赵丽 (Zhao Li)
Owner: XIDIAN UNIV