
Two-frame Image Fusion Method under Different Illumination Based on Texture Information Reconstruction

A texture-information reconstruction and image fusion technology, applied in the field of image processing. It addresses the problems of color cast in the fused image, failure to consider the texture information components of the low-light image, and blurring of some regions of the fused image, thereby improving clarity, improving adaptability to scenes of different brightness, and overcoming blurring in some regions.

Active Publication Date: 2017-06-13
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

The disadvantage of this method is that it fuses the red, green and blue (RGB) color channels directly, resulting in a color shift in the final fused image; secondly, it does not consider the texture information components of the low-light image, resulting in blurring of some areas of the fused image.




Embodiment Construction

[0039] The present invention will be further described below in conjunction with the accompanying drawings.

[0040] With reference to Figure 1, the steps implemented by the present invention are described in further detail below.

[0041] Step 1, input the images to be fused.

[0042] Input one frame of image taken under flash conditions and one frame taken without flash as the two images to be fused.

[0043] Step 2, transform the channels of the images to be fused.

[0044] Use the following formulas to transform the frame taken under flash conditions from its red, green and blue matrices into three matrices of luminance, blue difference and red difference:

[0045] YF = 0.2990 × R_f + 0.5870 × G_f + 0.1140 × B_f

[0046] CbF = -0.1687 × R_f - 0.3313 × G_f + 0.5000 × B_f + 128

[0047] CrF = 0.5000 × R_f - 0.4187 × G_f - 0.0813 × B_f + 128

[0048] Among them, YF, CbF and CrF represent the luminance, blue-difference and red-difference matrices of the frame captured under flash conditions after the channel transformation, respectively, and R_f, G_f and B_f are the red, green and blue matrices of that frame.
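The coefficients in formulas [0045]-[0047] are those of the standard JPEG (ITU-R BT.601) RGB-to-YCbCr conversion. The following minimal NumPy sketch shows this channel transformation; the function name and the H×W×3 array layout are illustrative assumptions rather than details taken from the patent, and the same transform would also be applied to the frame taken without flash.

import numpy as np

def rgb_to_ycbcr(rgb):
    # Split an H x W x 3 RGB image (values in 0..255) into luminance (Y),
    # blue-difference (Cb) and red-difference (Cr) matrices, using the same
    # coefficients as formulas [0045]-[0047].
    rgb = rgb.astype(np.float64)
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    Y  =  0.2990 * R + 0.5870 * G + 0.1140 * B
    Cb = -0.1687 * R - 0.3313 * G + 0.5000 * B + 128.0
    Cr =  0.5000 * R - 0.4187 * G - 0.0813 * B + 128.0
    return Y, Cb, Cr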



Abstract

The invention discloses a fusion method for two images of different illuminations based on texture information reconstruction. The method comprises the implementation steps of: 1, inputting the images to be fused; 2, converting the channels of the images to be fused; 3, reconstructing a brightness matrix; 4, filtering the images to be fused with a weighted least squares adaptive filter; 5, obtaining a texture information matrix of the images to be fused; 6, reconstructing the texture information matrix; 7, fusing the images to be fused; 8, converting the matrices back; and 9, outputting the fused image. The method adaptively boosts the fused image by means of the texture information and brightness information of the two images of different brightness, improves the definition of the fused image, lowers its noise, and overcomes the limitation of existing image fusion technology to a single brightness scene.
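To make the flow of steps 1 to 9 concrete, the sketch below strings the stages together for one flash/no-flash pair. It is only an illustration of the overall structure under stated assumptions: the patent's weighted least squares adaptive filter is stood in for by a simple Gaussian base/detail split, its brightness and texture reconstruction rules by a plain average-base and maximum-texture rule, and rgb_to_ycbcr refers to the conversion sketched earlier on this page.

import numpy as np
from scipy.ndimage import gaussian_filter

def ycbcr_to_rgb(Y, Cb, Cr):
    # Inverse of the [0045]-[0047] transform (standard JPEG inverse), used for step 8.
    R = Y + 1.402 * (Cr - 128.0)
    G = Y - 0.344136 * (Cb - 128.0) - 0.714136 * (Cr - 128.0)
    B = Y + 1.772 * (Cb - 128.0)
    return np.clip(np.stack([R, G, B], axis=-1), 0, 255).astype(np.uint8)

def fuse_flash_noflash(rgb_flash, rgb_noflash, sigma=4.0):
    # Steps 1-2: channel conversion of both input frames (rgb_to_ycbcr as sketched above).
    Yf, Cbf, Crf = rgb_to_ycbcr(rgb_flash)     # flash frame: texture source
    Yn, Cbn, Crn = rgb_to_ycbcr(rgb_noflash)   # no-flash frame: colour source

    # Steps 4-5 (stand-in): split each brightness matrix into a smooth base layer
    # and a texture (detail) layer; the patent uses a weighted least squares
    # adaptive filter here, the Gaussian filter is only a placeholder.
    base_f = gaussian_filter(Yf, sigma)
    base_n = gaussian_filter(Yn, sigma)
    tex_f, tex_n = Yf - base_f, Yn - base_n

    # Steps 3, 6-7 (stand-in): reconstruct brightness as the average of the two base
    # layers and take the stronger texture response of the two frames at each pixel.
    base_fused = 0.5 * (base_f + base_n)
    texture = np.where(np.abs(tex_f) >= np.abs(tex_n), tex_f, tex_n)
    Y_fused = np.clip(base_fused + texture, 0.0, 255.0)

    # Steps 8-9: keep the chroma of the low-light frame to avoid colour cast,
    # then convert back to RGB and output.
    return ycbcr_to_rgb(Y_fused, Cbn, Crn)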

Description

Technical Field
[0001] The invention belongs to the technical field of image processing, and further relates, within the field of image definition enhancement, to a method for fusing two frames of images captured under different illuminations based on reconstruction of texture information. The invention can be used to fuse one high-brightness frame taken with flash and one low-brightness frame taken without flash at night, extracting the texture information of the brighter image while retaining the color information of the darker image, in order to improve the clarity of the final fused image.

Background Technique
[0002] At present, existing image fusion methods mainly include spatial-domain and frequency-domain image fusion methods; the most representative fusion methods include the gradient fusion method in the spatial domain...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T5/50
Inventor: 宋彬, 王博, 秦浩, 蒋国良, 陈鹏
Owner: XIDIAN UNIV