Infrared and visible light image fusion method based on gray level co-occurrence matrix

A gray-level co-occurrence-matrix-based image fusion technology, applied in image enhancement, image analysis, and image data processing, which addresses the problems of losing texture details and neglecting the infrared target.

Active Publication Date: 2020-10-23
HUAIYIN TEACHERS COLLEGE

Problems solved by technology

However, in practical applications of NSCT, the fused image often emphasizes texture information while ignoring the infrared target, or preserves the infrared target at the cost of texture details; the two kinds of important information cannot be taken into account at the same time.



Examples


Embodiment 1

[0140] Example 1: The "Camp" images used in this experiment are shown in Figure 6: Figure 6(a) is the visible light image and Figure 6(b) is the infrared image. The experimental results are shown in Table 1:

[0141] Table 1 "Camp" image fusion evaluation index

[0142]

Embodiment 2

[0143] Example 2: The "Trees" images used in this experiment are shown in Figure 7: Figure 7(a) is the visible light image and Figure 7(b) is the infrared image. The experimental results are shown in Table 2:

[0144] Table 2 "Trees" image fusion evaluation index

[0145]

[0146] From the results in Table 1 and Table 2, it can be seen that, compared with the seven other classical fusion algorithms, the fusion method of the present invention performs best on the objective evaluation indicators; in particular, gray standard deviation (SD), spatial frequency (SF), and visual information fidelity (VIFF) significantly outperform the other methods. That is, the method of the present invention not only highlights the infrared target but also preserves texture details, which is more conducive to human observation, and the quality of the fused image is higher. The six objective evaluatio...
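Two of the indicators named above have standard closed forms: gray standard deviation (SD) is simply the standard deviation of the intensity values, and spatial frequency (SF) combines row- and column-wise gradient energy. A minimal NumPy sketch under those common definitions (VIFF is omitted, since it needs a full reference implementation; the function names are mine, not the patent's):

```python
import numpy as np

def gray_std(img):
    """Gray standard deviation (SD): spread of intensities about the mean.
    Larger SD usually indicates higher contrast in the fused image."""
    return float(np.std(img.astype(np.float64)))

def spatial_frequency(img):
    """Spatial frequency (SF): sqrt(RF^2 + CF^2), where RF and CF are the
    RMS horizontal and vertical first differences. Larger SF indicates
    more gradient activity, i.e. richer detail and texture."""
    img = img.astype(np.float64)
    rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))  # row frequency
    cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))  # column frequency
    return float(np.sqrt(rf ** 2 + cf ** 2))
```

A flat image scores zero on both, while alternating columns of 0 and 1 give SF = 1, matching the intuition that SF rewards fine detail.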



Abstract

The invention discloses an infrared and visible light image fusion method based on a gray-level co-occurrence matrix. The method includes: firstly, performing gray-level co-occurrence matrix analysis on an infrared source image to obtain an infrared target saliency map; secondly, carrying out the non-subsampled contourlet transform (NSCT) on the visible light and infrared source images, performing contrast-maintaining fusion on the low-frequency sub-band images obtained by decomposition, and fusing the high-frequency sub-band images by an improved Gaussian difference method; then mapping the target saliency map onto the fused low-frequency sub-band image; and finally, performing the inverse NSCT to obtain the final fused image. The invention uses the texture-analysis capability of the gray-level co-occurrence matrix for infrared target saliency detection, so the infrared target can be effectively extracted, rich detail information is retained, and the quality of the fused image is improved. The objective evaluation indexes of the method are superior to those of existing classic image fusion methods such as wavelet transform and pyramid transform, and the method has very strong robustness.
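The first step above, GLCM analysis of the infrared image, can be sketched with NumPy. The patent's exact displacement set, quantization, and saliency statistic are not given in this extract, so the following is only an illustration: one co-occurrence matrix for a single displacement plus the classic contrast feature (all names and parameters here are my assumptions):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Gray-level co-occurrence matrix for one displacement (dx, dy):
    counts pairs (img[y, x], img[y+dy, x+dx]) after quantizing 0..255
    intensities into `levels` bins, then normalizes to probabilities."""
    q = (img.astype(np.float64) / 256.0 * levels).astype(int)
    h, w = q.shape
    ys, ye = max(0, -dy), h - max(0, dy)
    xs, xe = max(0, -dx), w - max(0, dx)
    a = q[ys:ye, xs:xe]                       # reference pixels
    b = q[ys + dy:ye + dy, xs + dx:xe + dx]   # displaced neighbours
    p = np.zeros((levels, levels))
    np.add.at(p, (a.ravel(), b.ravel()), 1)   # accumulate pair counts
    return p / p.sum()

def glcm_contrast(p):
    """Contrast feature sum_{i,j} (i - j)^2 * p(i, j): near zero on smooth
    regions, large on strongly textured ones -- the kind of statistic a
    GLCM-based saliency map can be built from."""
    i, j = np.indices(p.shape)
    return float(np.sum((i - j) ** 2 * p))
```

Sliding such a statistic over local windows of the infrared image yields a texture map; a flat patch scores 0, while columns alternating between 0 and 255 score 7² = 49 with 8 gray levels.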

Description

technical field

[0001] The invention belongs to the technical field of multi-source image fusion, and in particular relates to an infrared and visible light image fusion method based on a gray-level co-occurrence matrix.

Background technique

[0002] Infrared and visible light image fusion is an important branch of image fusion research. An infrared image is formed by a sensor from the infrared radiation emitted by the target and its background, and can reveal hidden or camouflaged targets. A visible light image records the visible-light reflectance of the scene and contains abundant detail and texture information, which conforms to the characteristics of human vision.

[0003] The purpose of infrared and visible light image fusion is to obtain a complete image that contains rich detail information and accurately reflects the infrared target. The technology is therefore widely used in night imaging equipment t...
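The abstract above fuses the high-frequency NSCT sub-bands with an "improved Gaussian difference" rule, but the improvement itself is not described in this extract. As a rough illustration only, the sketch below uses a plain difference-of-Gaussians (DoG) response as a per-pixel activity measure and keeps the coefficient from whichever source band responds more strongly; the sigmas, kernel radius, and selection rule are my assumptions, not the patent's:

```python
import numpy as np

def gaussian_kernel(sigma, radius=3):
    """Normalized 1-D Gaussian kernel of length 2*radius + 1."""
    x = np.arange(-radius, radius + 1, dtype=np.float64)
    k = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian blur: convolve rows, then columns."""
    k = gaussian_kernel(sigma)
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)

def fuse_highfreq(h_vis, h_ir, s1=1.0, s2=2.0):
    """Per pixel, keep the coefficient whose DoG magnitude is larger.
    DoG is a band-pass response, so it rewards salient local detail."""
    act_vis = np.abs(blur(h_vis, s1) - blur(h_vis, s2))
    act_ir = np.abs(blur(h_ir, s1) - blur(h_ir, s2))
    return np.where(act_vis >= act_ir, h_vis, h_ir)
```

Every output pixel is taken verbatim from one of the two input bands, so a choose-max rule of this kind transfers detail rather than averaging it away, which is why such rules are common for high-frequency sub-bands.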


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/50
CPC: G06T5/50; G06T2207/10048; G06T2207/10052; G06T2207/20221
Inventor: 谭惜姿, 郭立强
Owner: HUAIYIN TEACHERS COLLEGE