
Hyperspectral Image Super-resolution Reconstruction Method Based on Coupling Dictionary and Spatial Transformation Estimation

A technology for hyperspectral image super-resolution and spatial transformation, applied in the field of hyperspectral image super-resolution reconstruction based on a coupled dictionary and spatial transformation estimation. It solves the problem of low reconstruction accuracy, achieves a good super-resolution effect, and reduces restrictions on the algorithm's use.

Active Publication Date: 2019-10-22
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

[0003] In order to overcome the deficiency of low reconstruction accuracy in existing hyperspectral image super-resolution reconstruction methods, the present invention provides a hyperspectral image super-resolution reconstruction method based on a coupled dictionary and spatial transformation estimation.




Embodiment Construction

[0054] The spatial resolution of hyperspectral images is very low, and simply applying super-resolution methods developed for true-color images cannot improve it effectively. True-color images, by contrast, are much easier to obtain, and the main purpose of the present invention is to use a true-color image of the same scene to improve the spatial resolution of a hyperspectral image. Assume that the acquired and registered low-resolution hyperspectral image and high-resolution true-color image are given, and that the target image has both high spatial resolution and high spectral resolution, where L and l denote the numbers of bands of the hyperspectral image and the true-color image, w and h denote the width and height of the low-resolution hyperspectral image, and W and H denote the width and height of the high-resolution true-color image. It is further assumed that n and N denote the numbers of pixels in the hyperspectral image and the true-color image, with n = w×h and N = W×H.
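As a rough illustration of the data layout described in [0054], the sketch below lays out the two observed images with one pixel per column and records the shape of the reconstruction target. The variable names, band counts, and the 32× scale factor (taken from the abstract) are illustrative assumptions, not values fixed by the patent.

```python
import numpy as np

# Illustrative sizes (assumptions): L hyperspectral bands, l = 3 true-color bands,
# and a spatial super-resolution factor of 32 as mentioned in the abstract.
L, l = 128, 3
w, h = 8, 8              # low-resolution width and height
W, H = 32 * w, 32 * h    # high-resolution width and height
n, N = w * h, W * H      # pixel counts: n = w*h, N = W*H

# One pixel per column: the observed low-resolution hyperspectral image and the
# registered high-resolution true-color image (random stand-ins for real data).
Y_hs = np.random.rand(L, n)
Z_rgb = np.random.rand(l, N)

# The reconstruction target keeps all L bands at the high spatial resolution.
target_shape = (L, N)
print(Y_hs.shape, Z_rgb.shape, target_shape)   # (128, 64) (3, 65536) (128, 65536)
```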



Abstract

The invention discloses a hyperspectral image super-resolution reconstruction method based on a coupled dictionary and spatial transformation estimation, which is used to solve the technical problem of low reconstruction accuracy in existing hyperspectral image super-resolution reconstruction methods. The hyperspectral image is first linearly unmixed according to spectral unmixing theory to obtain the corresponding spectral dictionary. Sparse representation theory is then used to establish a coupled-dictionary super-resolution reconstruction model of the hyperspectral image, in which the spatial transformation matrix between the hyperspectral image and the true-color image serves as a regularization term, reducing the restrictions on the algorithm's use. An improved PALM algorithm is then used to solve the model and obtain the super-resolution-reconstructed hyperspectral image. Test results show that, on accuracy indexes such as the root-mean-square error (RMSE) and the spectral angle mapper (SAM), the method outperforms the hyperspectral image super-resolution reconstruction method of the background art, and it achieves a good super-resolution effect even at a spatial super-resolution factor of 32.
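To make the abstract's pipeline concrete, the sketch below writes down the coupled observation model that unmixing-based hyperspectral/true-color fusion methods typically assume: the target image is a spectral dictionary times sparse codes, the hyperspectral observation is its spatial downsampling, and the true-color observation is its spectral projection. The operators and names here (D, A, R, spatial_downsample) are generic illustrations, not the patent's exact formulation, which additionally estimates a spatial transformation matrix and uses it as a regularizer.

```python
import numpy as np

# Generic coupled observation model for unmixing-based fusion (illustrative only).
L, l = 128, 3                    # hyperspectral / true-color band counts
p = 10                           # number of spectral dictionary atoms (endmembers)
W, H, factor = 64, 64, 8         # high-resolution size and spatial downsampling factor
N, n = W * H, (W // factor) * (H // factor)

D = np.abs(np.random.rand(L, p))   # spectral dictionary obtained by linear unmixing
A = np.abs(np.random.rand(p, N))   # sparse abundance codes at high spatial resolution
R = np.abs(np.random.rand(l, L))   # spectral response mapping L bands to 3 bands

def spatial_downsample(img_cols, W, H, factor):
    """Average-pool the columns of a (bands, W*H) matrix down to the low resolution."""
    bands = img_cols.shape[0]
    cube = img_cols.reshape(bands, H, W)
    pooled = cube.reshape(bands, H // factor, factor, W // factor, factor).mean(axis=(2, 4))
    return pooled.reshape(bands, -1)

X = D @ A                                 # target high-resolution hyperspectral image
Y = spatial_downsample(X, W, H, factor)   # low-resolution hyperspectral observation
Z = R @ X                                 # registered true-color observation
print(X.shape, Y.shape, Z.shape)          # (128, 4096) (128, 64) (3, 4096)
```

In this generic formulation, reconstruction amounts to recovering the codes A (and hence X = D·A) from Y and Z; according to the abstract, the patent solves its version of this model with an improved PALM algorithm and the spatial-transformation regularization term.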

Description

Technical field

[0001] The invention relates to a hyperspectral image super-resolution reconstruction method, and in particular to a hyperspectral image super-resolution reconstruction method based on a coupled dictionary and spatial transformation estimation.

Background technique

[0002] The document "Hyperspectral and Multispectral Image Fusion Based on a Sparse Representation [J]. IEEE Transactions on Geoscience and Remote Sensing, 2015, 53(7): 3658-3668" discloses a hyperspectral image super-resolution reconstruction algorithm based on image fusion and sparse representation. The algorithm uses an online learning method to obtain the spectral dictionary of the hyperspectral image, introduces sparse constraints into the traditional optimization framework, uses the SALSA scheme to optimize the solution, and finally obtains a hyperspectral image with high spatial resolution. However, this method does not consider the actual physical meaning when obtaining the dictionary. Th...
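The sparse-representation step referenced in the background (sparse constraints added to a traditional optimization framework) can be illustrated with a generic ISTA-style proximal solver; this stands in for the SALSA scheme cited above, and the dictionary sizes and variable names are assumptions for a toy example only.

```python
import numpy as np

def ista_sparse_code(y, D, lam=0.1, n_iter=200):
    """Solve min_a 0.5*||y - D a||^2 + lam*||a||_1 with ISTA (illustrative solver)."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2      # 1 / Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - y)                # gradient of the quadratic data term
        a = a - step * grad                     # gradient step
        a = np.sign(a) * np.maximum(np.abs(a) - lam * step, 0.0)  # soft-thresholding
    return a

# Toy usage: sparse-code one hyperspectral pixel against a spectral dictionary.
rng = np.random.default_rng(0)
D = rng.random((128, 10))                # 128 bands, 10 atoms (illustrative sizes)
a_true = np.zeros(10)
a_true[[2, 7]] = [0.6, 0.4]              # two active atoms
y = D @ a_true
print(np.round(ista_sparse_code(y, D), 2))
```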


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T3/40
Inventor: 魏巍, 张艳宁, 李勇, 张磊
Owner: NORTHWESTERN POLYTECHNICAL UNIV