
Unsupervised hyperspectral image blind fusion method and system based on spatial-spectral joint residual correction network

A hyperspectral image and residual correction technology, applied in the field of remote sensing image processing, which addresses the problems that blind fusion is highly ill-posed, that high-spatial-resolution hyperspectral images are difficult to acquire directly, and that the accuracy of fusion results deviates from the real hyperspectral image, thereby improving accuracy and achieving good fusion results.

Active Publication Date: 2022-02-25
NANJING UNIV OF SCI & TECH

Problems solved by technology

[0004] However, hyperspectral image fusion based on deep learning still faces the following problems: (1) The spatial downsampling operator and the spectral response matrix both depend on the imaging device, and many super-resolution methods assume these two operators are known by simulating the imaging device's parameters; the larger the error between the assumed operators and the actual device parameters, the further the fusion result deviates from the real hyperspectral image. (2) Imaging devices typically reduce the resolution of one dimension to improve the resolution of the other, so high-spatial-resolution hyperspectral images are difficult to acquire directly, and building a supervised training model therefore does not meet practical needs. (3) With the spatial degradation operator and spectral response matrix unknown, unsupervised fusion is a highly ill-posed problem: unsupervised training cannot use the real high-spatial-resolution hyperspectral image in the loss function and can only evaluate image quality indirectly, for example by degrading the network output to low resolution and comparing it with the known low-resolution observations. How to train an unsupervised blind fusion network with limited low-resolution data and how to design an effective loss function are therefore the key difficulties.
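To make the indirect evaluation described in point (3) concrete, here is a minimal PyTorch sketch of such an unsupervised loss; the function names, the use of an L1 distance, and the equal weighting of the two terms are illustrative assumptions, not the patent's exact design:

import torch.nn.functional as F

def unsupervised_loss(x_hat, lr_hsi, hr_msi, spatial_down, spectral_down):
    # x_hat: current fused estimate of the high-resolution hyperspectral image.
    # spatial_down / spectral_down: the (possibly estimated) degradation operators.
    # No high-resolution reference image is used anywhere in this loss.
    loss_spatial = F.l1_loss(spatial_down(x_hat), lr_hsi)    # consistency with the observed LR-HSI
    loss_spectral = F.l1_loss(spectral_down(x_hat), hr_msi)  # consistency with the observed HR-MSI
    return loss_spatial + loss_spectral

Because the estimate is only ever judged through its degraded versions, the loss can be computed from the available low-resolution data alone, which is what makes unsupervised blind fusion possible in principle.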

Detailed Description of the Embodiments

[0020] In order to enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.

[0021] As shown in Figure 1, the unsupervised hyperspectral image blind fusion method based on a spatial-spectral joint residual correction network of the present invention includes the following steps:

[0022] (1) Establish spatial degradation and spectral degradation models based on hyperspectral data:

[0023] Step 1. Obtain...
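As a concrete illustration of step (1), the two degradation models can be implemented as small learnable modules, one for the spatial blur-and-downsample path and one for the spectral response; the flat kernel initialization, softmax normalization, and default sizes below are assumptions made for illustration, not the patent's specification:

import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialDegradation(nn.Module):
    # HR-HSI -> LR-HSI: a blur kernel shared across bands, followed by downsampling.
    def __init__(self, kernel_size=9, scale=4):
        super().__init__()
        self.scale = scale
        self.kernel = nn.Parameter(torch.zeros(kernel_size, kernel_size))

    def forward(self, x):
        # Softmax keeps the estimated kernel non-negative and summing to one.
        k = torch.softmax(self.kernel.flatten(), dim=0).view(1, 1, *self.kernel.shape)
        c = x.shape[1]
        x = F.conv2d(x, k.repeat(c, 1, 1, 1), padding=self.kernel.shape[-1] // 2, groups=c)
        return x[:, :, ::self.scale, ::self.scale]

class SpectralDegradation(nn.Module):
    # HR-HSI -> HR-MSI: a spectral response matrix mapping hyperspectral bands to multispectral bands.
    def __init__(self, n_hs_bands, n_ms_bands):
        super().__init__()
        self.srf = nn.Parameter(torch.zeros(n_ms_bands, n_hs_bands))

    def forward(self, x):
        r = torch.softmax(self.srf, dim=1)  # each MS band is a normalized mixture of HS bands
        b, c, h, w = x.shape
        return (r @ x.reshape(b, c, -1)).reshape(b, -1, h, w)

Because the blur kernel and spectral response matrix are parameters rather than fixed inputs, they can be estimated jointly with the fusion network from the observed data, which is what makes the fusion "blind".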


Abstract

The invention discloses an unsupervised hyperspectral image blind fusion method and system based on a spatial-spectral joint residual correction network. The method comprises the following steps: building a degradation network structure for hyperspectral images to simulate the spatial and spectral downsampling processes; establishing a spatial and spectral residual fusion network model, and using the differences between the low-resolution results produced by the degradation model and the training data as the input of the fusion network, that is, fusing the residuals in the spatial and spectral dimensions to obtain a residual map corresponding to the input data; and correcting the initialized data and sending the corrected result through the degradation network and the spatial-spectral joint correction network for multiple iterations to improve the precision of the fusion result. The invention uses a spatial-spectral joint correction network suitable for unsupervised hyperspectral image blind fusion, which can obtain an error map between the input hyperspectral image and the true value.
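Read as pseudocode, the iterative correction loop described above might look like the following sketch; the bicubic initialization, the fixed number of iterations, and the interface of correction_net are assumptions for illustration rather than the patent's exact architecture:

import torch.nn.functional as F

def iterative_residual_fusion(lr_hsi, hr_msi, degrade_spatial, degrade_spectral,
                              correction_net, scale=4, n_iters=5):
    # Initialize the estimate, e.g. by interpolating the LR-HSI up to the MSI grid.
    x_hat = F.interpolate(lr_hsi, scale_factor=scale, mode="bicubic", align_corners=False)
    for _ in range(n_iters):
        # Residuals between the observations and the degraded current estimate.
        res_spatial = lr_hsi - degrade_spatial(x_hat)
        res_spectral = hr_msi - degrade_spectral(x_hat)
        # The joint correction network fuses the two residuals into an error map
        # in the high-resolution hyperspectral domain and corrects the estimate.
        x_hat = x_hat + correction_net(res_spatial, res_spectral)
    return x_hat

Each pass refines the estimate only through quantities that can actually be computed from the observed low-resolution data, consistent with the unsupervised setting.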

Description

Technical Field

[0001] The invention belongs to the technical field of remote sensing image processing, and in particular relates to an unsupervised hyperspectral image blind fusion method and system based on a spatial-spectral joint residual correction network.

Background

[0002] Hyperspectral image fusion is an important application direction in the field of hyperspectral remote sensing. It combines the rich spectral information of a low-spatial-resolution hyperspectral image with the rich spatial information of a high-spatial-resolution multispectral image to synthesize high-spatial-resolution hyperspectral data, which can provide high-quality training sets for subsequent, more complex image processing tasks. Due to the limitations of existing sensor hardware, it is difficult to directly acquire images with both high spatial resolution and high spectral resolution. Therefore, the collected data can be post-processed thr...
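In the notation commonly used in the hyperspectral fusion literature (the symbols here are an assumption for readability; the excerpt does not give the patent's own notation), the two observations are modeled as degraded versions of the unknown high-spatial-resolution hyperspectral image X:

    Y = X B S    (spatial blurring B followed by downsampling S yields the observed low-resolution hyperspectral image)
    Z = R X      (the spectral response matrix R of the multispectral sensor yields the observed high-resolution multispectral image)

Fusion seeks an X consistent with both observations; when B, S and R are unknown, this becomes the blind fusion setting addressed by the invention.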


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06V20/13, G06V10/774, G06V10/80, G06V10/82, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/04, G06N3/088, G06F18/214, G06F18/253
Inventors: 徐洋, 王婷婷, 吴泽彬, 韦志辉
Owner: NANJING UNIV OF SCI & TECH