
Method for evaluating remote-sensing image fusion effect

A remote-sensing image fusion evaluation method, applied in image enhancement, image analysis, and image data processing. It addresses the problems that subjective evaluation is error-prone, that image quality is difficult to assess comprehensively, and that there is no uniform standard for index selection, achieving objective evaluation results and improved classification accuracy.

Inactive Publication Date: 2013-11-06
NORTHEAST INST OF GEOGRAPHY & AGRIECOLOGY C A S
Cites: 2 · Cited by: 14

AI Technical Summary

Problems solved by technology

[0004] The present invention aims to solve the problems that, in existing remote-sensing image fusion evaluation methods, the dominant visual-effect assessment is heavily influenced by human factors and prone to error, that index selection in objective mathematical-statistical analysis lacks a uniform standard, and that image quality is difficult to evaluate comprehensively. To this end, a method for evaluating the remote-sensing image fusion effect is provided.



Examples


Specific Embodiment 1

[0020] The remote-sensing image fusion effect evaluation method of this embodiment comprises the following steps:

[0021] Step 1: Preprocess the original images to be fused;

[0022] Step 2: Fuse the preprocessed multispectral image with the panchromatic image to obtain a fused result image;

[0023] Step 3: Perform object-oriented segmentation on the images before and after fusion to obtain segmented remote-sensing images;

[0024] Step 4: Apply classification rules to the segmented remote-sensing images to obtain classification results for the images before and after fusion;

[0025] Step 5: Evaluate the accuracy of the classification results in terms of producer's accuracy, user's accuracy, and the Kappa coefficient, and compare the classification accuracy of the images before and after fusion to evaluate the fusion effect.
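The accuracy measures named in Step 5 are all derived from a confusion matrix (rows = reference classes, columns = classified pixels). A minimal sketch of that computation follows; the function name and the example matrix are illustrative, not from the patent.

```python
import numpy as np

def accuracy_metrics(cm):
    """Compute per-class producer's/user's accuracy and the Kappa
    coefficient from a confusion matrix (rows = reference classes,
    columns = classified pixels)."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    row = cm.sum(axis=1)                # reference totals per class
    col = cm.sum(axis=0)                # classified totals per class
    diag = np.diag(cm)                  # correctly classified pixels
    producers = diag / row              # producer's accuracy (omission view)
    users = diag / col                  # user's accuracy (commission view)
    po = diag.sum() / total             # observed overall agreement
    pe = (row * col).sum() / total**2   # agreement expected by chance
    kappa = (po - pe) / (1 - pe)
    return producers, users, kappa
```

Comparing the Kappa values (and the per-class accuracies) computed this way for the pre-fusion and post-fusion classifications gives the fusion-effect comparison of Step 5.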

[0026] The effects of this embodiment:

[0027...

Specific Embodiment 2

[0029] The difference between this embodiment and Embodiment 1 is that the preprocessing of the original images to be fused described in Step 1 comprises the following steps:

[0030] A quadratic polynomial transformation is used to geometrically register the original multispectral image to the panchromatic image, maintaining geometric consistency between them;

[0031] Nearest-neighbor interpolation is used to resample the multispectral image so that the pixel sizes of the images to be fused are consistent;

[0032] On the basis of the above processing, the test area is clipped from the images to obtain the original multispectral and panchromatic images to be fused over the same area. Other steps and parameters are the same as those in Embodiment 1.
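The two preprocessing operations above can be sketched as follows: a least-squares fit of the second-order polynomial registration model from ground control points, and integer-factor nearest-neighbor upsampling. Both function names and the control-point layout are assumptions for illustration; a production workflow would typically use a GIS/remote-sensing library instead.

```python
import numpy as np

def fit_quadratic_polynomial(src, dst):
    """Least-squares fit of a second-order (quadratic) polynomial mapping
    (x, y) -> (x', y') from ground-control-point pairs; needs >= 6 points."""
    x, y = src[:, 0], src[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    cx, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)  # coefficients for x'
    cy, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)  # coefficients for y'
    return cx, cy

def nearest_neighbor_upsample(band, factor):
    """Resample a 2-D band to a finer grid by an integer factor using
    nearest-neighbor interpolation (each pixel is simply replicated)."""
    return np.repeat(np.repeat(band, factor, axis=0), factor, axis=1)
```

Nearest-neighbor resampling is chosen here, as in the patent, because it preserves the original spectral values of the multispectral pixels rather than interpolating new ones.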

Specific Embodiment 3

[0033] The difference between this embodiment and Embodiment 1 or 2 is that, in Step 2, the preprocessed multispectral image and the panchromatic image are fused using a chosen fusion method to obtain the fused result image. Other steps and parameters are the same as those in Embodiment 1 or Embodiment 2.
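The patent deliberately leaves the fusion operator open ("a certain fusion method"). As one common stand-in, a Brovey-transform pan-sharpening step can be sketched; the function name and array layout below are assumptions, not the patent's prescribed method.

```python
import numpy as np

def brovey_fusion(ms, pan, eps=1e-12):
    """Brovey-transform fusion: inject panchromatic spatial detail by
    scaling each multispectral band by pan / per-pixel MS intensity.
    ms: (bands, H, W), already resampled to pan resolution; pan: (H, W)."""
    intensity = ms.sum(axis=0) + eps   # eps avoids division by zero
    return ms * (pan / intensity)
```

Any other pixel-level method (IHS, PCA, wavelet-based, etc.) could be substituted here; the evaluation procedure of Steps 3 to 5 is independent of the fusion operator chosen.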



Abstract

The invention relates to remote-sensing image effect evaluation, in particular to a method for evaluating a remote-sensing image fusion effect. It solves the problems that, in existing evaluation methods, the dominant visual-effect assessment is heavily influenced by human factors and prone to error, that index selection in objective mathematical-statistical analysis lacks a uniform standard, and that image quality is hard to evaluate comprehensively. The method includes the following steps: first, the original images to be fused are preprocessed; second, the preprocessed multispectral image is fused with the panchromatic image; third, object-oriented segmentation is performed on the images before and after fusion; fourth, classification rules are used to classify the segmented remote-sensing images; fifth, the classification results are evaluated for accuracy in terms of producer's accuracy, user's accuracy, and the Kappa coefficient, and the classification accuracy of the images before and after fusion is compared so that the fusion effect can be evaluated. The method can be applied in the technical field of remote-sensing image processing.

Description

Technical field

[0001] The invention relates to remote-sensing image effect evaluation, in particular to a remote-sensing image fusion effect evaluation method, and belongs to the field of remote-sensing image processing technology.

Background technique

[0002] With the rapid development of modern remote sensing and related technologies, many algorithms for remote-sensing image fusion have emerged, but no unified theoretical system has been formed, and the effects of the various fusion methods are difficult to determine quantitatively. At present there are two views on the quality evaluation of image fusion. One holds that any image is ultimately for people to see, so an evaluation method based on visual effects should be adopted: evaluation is primarily subjective and visual, with standards formulated from statistical data. The other holds that subjective interpretation is incomplete and one-sided and cannot withstand repeated inspection, because the evaluation results may differ...

Claims


Application Information

Patent Timeline
IPC(8): G06T7/00; G06T5/50
Inventors: 董张玉, 王宗明, 刘殿伟, 任春颖, 汤旭光, 贾明明, 丁智
Owner NORTHEAST INST OF GEOGRAPHY & AGRIECOLOGY C A S