
Image fusion method based on nonsubsampled contourlet transform

A contourlet-transform and image-fusion technology, applied in the field of image fusion

Status: Inactive · Publication Date: 2007-12-26
HUAZHONG UNIV OF SCI & TECH
Cites: 0 · Cited by: 45

AI Technical Summary

Problems solved by technology

Methods of this type are simple in structure and easy to implement, but a target is usually not described by a single pixel or by the pixels within a single window region, so pixel-based fusion methods have certain limitations when it comes to highlighting targets.


Detailed Description of the Embodiments

[0053] The present invention uses the Nonsubsampled Contourlet Transform (NSCT) to decompose the image at multiple scales, and then performs different fusion operations on the different frequency bands. The processing flow is shown in Figure 1.
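
As a minimal illustration of this flow, the Python sketch below assumes the NSCT forward and inverse transforms come from an external implementation (they are passed in as functions), and fuse_low / fuse_high are placeholder names for the band-wise rules; none of these names come from the patent text itself.

```python
# Minimal sketch of the Figure 1 flow. nsct_decompose / nsct_reconstruct are
# assumed to be provided by an external NSCT implementation, and fuse_low /
# fuse_high are placeholder band-wise fusion rules (hypothetical names).
def fuse_images(img_a, img_b, nsct_decompose, nsct_reconstruct,
                fuse_low, fuse_high):
    """Decompose both sources, fuse each sub-band, then invert the transform."""
    # (1) NSCT decomposition: [Y0, Y1, ..., YN] per source image,
    #     where index 0 is the low-frequency sub-image.
    bands_a = nsct_decompose(img_a)
    bands_b = nsct_decompose(img_b)

    # (2) Band-wise fusion: a low-frequency rule for Y0 and a
    #     high-frequency rule for Y1..YN.
    fused = [fuse_low(bands_a[0], bands_b[0])]
    fused += [fuse_high(ya, yb) for ya, yb in zip(bands_a[1:], bands_b[1:])]

    # (3) Inverse NSCT of the fused coefficients yields the final image.
    return nsct_reconstruct(fused)
```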

[0054] (1) The source images A and B are each decomposed by the nonsubsampled contourlet transform into a low-frequency sub-image, Y_0^A and Y_0^B respectively, and a series of high-frequency sub-images Y_k^A and Y_k^B, k = 1, 2, ..., N, where N is the number of high-frequency sub-images.

[0055] The high-frequency part represents the detail components of the image and contains the edge detail information of the source image. The number N of high-frequency sub-images is determined by the number of pyramid decomposition levels in the nonsubsampled contourlet transform and by the number of directions in the directional filter bank decomposition; the low-frequency part represents the approximate component of the image, containing...
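
For illustration only, the small sketch below shows how N follows from those parameters under the common NSCT convention that each pyramid level j is split into 2**l_j directional sub-bands; the patent does not state this formula explicitly, so it is an assumption.

```python
# Number of high-frequency sub-images N, assuming each pyramid level j is
# split into 2**l_j directional sub-bands (a common NSCT convention; the
# exact convention used in the patent is not stated in the visible text).
def count_highpass_subimages(direction_exponents):
    """direction_exponents: one exponent l_j per pyramid decomposition level."""
    return sum(2 ** lj for lj in direction_exponents)

# Example: a 3-level pyramid split into 4, 8 and 8 directions gives N = 20.
print(count_highpass_subimages([2, 3, 3]))  # -> 20
```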


Abstract

An image fusion method based on the nonsubsampled contourlet transform: the image is decomposed at multiple scales with the nonsubsampled contourlet transform to obtain high-frequency and low-frequency sub-images; the high-frequency sub-images are fused using a resolution-based selection rule, and the low-frequency sub-images are fused using a selection rule based on the grey levels of the corresponding pixels; the inverse nonsubsampled contourlet transform is then applied to the fused low- and high-frequency sub-images to obtain the final fused result.
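
The abstract names the two band-wise rules only loosely, and their exact criteria are not visible in this text. The sketch below is therefore a heavily hedged stand-in: it uses local-energy selection for the high-frequency sub-images and simple averaging for the low-frequency sub-image, purely to show the roles the two rules play.

```python
# Placeholder band-wise rules matching the roles described in the abstract.
# The actual "resolution selection" and "grey-scale selection" criteria are
# not spelled out here, so local-energy selection and averaging are used as
# stand-ins only.
import numpy as np
from scipy.ndimage import uniform_filter

def fuse_high(ya, yb, win=3):
    """Per pixel, keep the high-frequency coefficient with larger local energy."""
    ea = uniform_filter(ya.astype(float) ** 2, size=win)
    eb = uniform_filter(yb.astype(float) ** 2, size=win)
    return np.where(ea >= eb, ya, yb)

def fuse_low(ya, yb):
    """Placeholder low-frequency rule: average of the two approximation bands."""
    return 0.5 * (ya + yb)
```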

Description

Technical field

[0001] The invention relates to the field of image fusion, and in particular to an image fusion method based on the Nonsubsampled Contourlet Transform (NSCT).

Background technique

[0002] Image fusion refers to combining the information of two or more source images to obtain a more accurate, comprehensive and reliable description of the same scene. Image fusion makes full use of the redundant and complementary information contained in the images to be fused. The fused image should better match the visual characteristics of humans or machines, so as to facilitate further image analysis, target detection, recognition or tracking.

[0003] Image fusion is divided into three levels, from low to high: pixel-level fusion, feature-level fusion and decision-level fusion. Pixel-level fusion directly performs pixel-associated fusion processing under the condition of strict image registration. Feature-level fusion is based on...


Application Information

IPC(8): G06T5/00
Inventors: 曹治国, 王凯, 肖阳, 徐正翔, 邹腊梅, 马明刚, 谭颖
Owner: HUAZHONG UNIV OF SCI & TECH