
Guide filtering-based two-stage remote sensing image fusion method

A remote sensing image fusion method based on guided filtering, applied in image enhancement, image data processing, instruments, etc. It addresses problems such as the spectral distortion of component-substitution methods, the spatial and spectral distortion of fused images, and assumed models that are not suitable for all satellite sensors.

Inactive Publication Date: 2018-01-05
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

Classical component-substitution methods include Intensity-Hue-Saturation (IHS), Gram-Schmidt (GS), and Principal Component Analysis (PCA). Because of the complex spectral response relationship between panchromatic and multispectral images, component-substitution methods lead to serious spectral distortion.
Model-based optimization methods establish a restoration model from the panchromatic image, the multispectral image, and the relationship between them; commonly used methods include sparse representation (SR). Because the spectral responses of remote sensing platforms differ, the assumed models may not suit every satellite sensor, so the fused images exhibit spatial and spectral distortion.



Embodiment Construction

[0018] The present invention is further described below in conjunction with the accompanying drawings and embodiments; the invention includes, but is not limited to, the following embodiments.

[0019] The present invention comprises the following steps:

[0020] Assume the original multispectral image MS i contains N bands, where the subscript i = 1, ..., N denotes the i-th band of the multi-band image.

[0021] The first step, upsampling multispectral image

[0022] The original multispectral image is upsampled by bicubic interpolation; the upsampled image is denoted LMS i .
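As an illustrative sketch (not the patent's own code), the upsampling step can be written in Python. Here `scipy.ndimage.zoom` with `order=3` (cubic spline interpolation) is used as a stand-in for bicubic interpolation; the function name and scale factor are assumptions for the example:

```python
import numpy as np
from scipy.ndimage import zoom

def upsample_ms(ms, scale):
    """Upsample each band of a multispectral cube (H, W, N) by `scale`
    using cubic spline interpolation (a stand-in for bicubic)."""
    return np.stack(
        [zoom(ms[..., i], scale, order=3) for i in range(ms.shape[-1])],
        axis=-1,
    )

ms = np.random.rand(16, 16, 4)   # toy 4-band multispectral image
lms = upsample_ms(ms, 4)         # match a 4x-resolution panchromatic grid
print(lms.shape)                 # (64, 64, 4)
```

A typical MS/PAN resolution ratio is 4, as used above; real pipelines would also need to keep the georeferencing consistent after resampling.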

[0023] The second step, histogram matching

[0024] Calculate the intensity component INT, which is estimated from a linear combination model of the upsampled multispectral bands (LMS i ):

[0025] [equation image not extracted]

[0026] where the linear combination coefficients are obtained by solving the following optimization model:

[0027] [equation image not extracted]

[002...
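The two formula images did not survive extraction. As a hedged reconstruction (the patent's exact notation may differ), the linear combination model and its least-squares coefficient fit are commonly written in the pansharpening literature as:

```latex
\mathrm{INT} = \sum_{i=1}^{N} w_i \,\mathrm{LMS}_i ,
\qquad
\{\hat{w}_i\} = \arg\min_{\{w_i\}}
\left\| P_{\downarrow} - \sum_{i=1}^{N} w_i \,\mathrm{LMS}_i \right\|_2^2 ,
```

where $P_{\downarrow}$ denotes the panchromatic image degraded (low-pass filtered and decimated) to the multispectral resolution, and the $\hat{w}_i$ are the fitted combination coefficients.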



Abstract

The invention provides a two-stage remote sensing image fusion method based on guided filtering. Image fusion is performed with guided-filtering models in two guidance modes across two stages: the preprocessed images are first fused using multi-channel guided filtering; the fusion result then undergoes spatial-resolution enhancement using single-channel guided filtering with a suitably designed enhancement model; finally, the original spectral information of the multispectral images and the spatial information of the panchromatic image are combined to obtain a high-quality fused image. The method is an effective fusion approach for high-resolution spaceborne multispectral and panchromatic images.
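The record does not reproduce the patent's own filtering models. The sketch below is a generic single-channel guided filter in the sense of He et al. (2010), the building block the abstract refers to, with `scipy.ndimage.uniform_filter` as the box filter; the function name and the default `radius` and `eps` values are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, radius=4, eps=1e-3):
    """Single-channel guided filter: smooth image p while following
    the edge structure of guide image I (both 2-D float arrays)."""
    size = 2 * radius + 1
    box = lambda x: uniform_filter(x, size=size, mode="reflect")
    mean_I, mean_p = box(I), box(p)
    cov_Ip = box(I * p) - mean_I * mean_p      # local covariance of I and p
    var_I = box(I * I) - mean_I ** 2           # local variance of the guide
    a = cov_Ip / (var_I + eps)                 # per-window linear coefficients
    b = mean_p - a * mean_I
    return box(a) * I + box(b)                 # q = mean(a) * I + mean(b)

# Smoke test: filter a random image against a constant guide.
I = np.ones((32, 32))
p = np.random.rand(32, 32)
q = guided_filter(I, p)
print(q.shape)  # (32, 32)
```

In a pansharpening pipeline of the kind the abstract describes, the panchromatic image would typically serve as the guide `I` and each fused band as the input `p`; the patent's multi-channel variant and enhancement model are not detailed in this record.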

Description

technical field [0001] The invention relates to a visualization-enhancement processing technology for remote sensing images, in particular to an image fusion method for spaceborne multispectral and panchromatic images. Background technique [0002] Owing to technical limitations such as the physical characteristics of the sensor, obtaining higher spectral resolution often requires sacrificing spatial resolution to some extent; conversely, if the primary goal is higher spatial resolution, spectral resolution must be reduced in exchange. To resolve this trade-off, remote sensing platforms such as QuickBird, WorldView-2, and GeoEye-1 are often equipped with sensors of different characteristics that acquire panchromatic images with high spatial resolution and multispectral images with high spectral resolution, respectively, forming complementary image datasets. By fusing these two types of images, the spatial resolution of the multispectra...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC(8): G06T5/40, G06T5/50
Inventor 李旭, 张艺鸣, 高昂, 李立欣
Owner NORTHWESTERN POLYTECHNICAL UNIV