
Object and scene fusion method based on IHS (Intensity, Hue, Saturation) transform

A method for fusing objects and scenes, applied in the field of non-real-time visual simulation, addressing the problems of strict registration-accuracy requirements, long processing time, and high complexity

Inactive Publication Date: 2012-05-02
SHANGHAI UNIV

Problems solved by technology

However, its high complexity, long processing time, and strict requirements for registration accuracy make it less widely used than general standard solutions in remote sensing, a field that requires fast interactive processing and real-time visualization.



Examples


Embodiment 1

[0080] Embodiment 1: This IHS-transform-based object and scene fusion method uses the standard IHS transform to perform brightness fusion, which is in effect a brightness-replacement operation. In this embodiment, the standard IHS transform is used as the brightness fusion tool to realize object and scene fusion. The specific steps are as follows:

[0081] ① IHS transformation: perform IHS transformation on the scene image Image1 and on the scene image Image2 (the scene with the target object added) to obtain Image1_hsi and Image2_hsi, and take their brightness components I1 and I2;

[0082] ② Take the histogram of I1 as the reference histogram and perform histogram matching on I2 to obtain I2'. This weakens the influence of the fusion process on the spectral information of the original scene image;

[0083] ③ Brightness fusion: replace the brightness component of Image1_hsi with the I2' obtained above, that is, set I1' = I2';

[0084] ④ Inverse IHS transformation: perform the inverse IHS transformation on Image1_hsi with the replaced brightness component to obtain the fused object and scene image.
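The steps above can be sketched briefly in Python. This is only an illustrative sketch, not the patent's implementation: it assumes the simple linear IHS intensity I = (R + G + B)/3, under which replacing I1 with the matched I2' is equivalent to adding (I2' - I1) to every RGB channel, so no explicit forward and inverse color transform is needed; the function names are invented for the example.

```python
import numpy as np

def match_histograms(source, reference):
    # Step 2: remap source intensities so that the histogram of the
    # result matches the reference histogram (empirical-CDF matching).
    s_vals, s_idx, s_counts = np.unique(
        source.ravel(), return_inverse=True, return_counts=True)
    r_vals, r_counts = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    r_cdf = np.cumsum(r_counts) / reference.size
    matched_vals = np.interp(s_cdf, r_cdf, r_vals)
    return matched_vals[s_idx].reshape(source.shape)

def ihs_fusion(image1, image2):
    # Steps 1-4 collapsed: with the linear intensity I = (R+G+B)/3,
    # replacing I1 by the matched I2' equals adding (I2' - I1)
    # to each channel of image1 (the fast-IHS identity).
    i1 = image1.mean(axis=2)
    i2 = image2.mean(axis=2)
    i2_matched = match_histograms(i2, i1)
    delta = (i2_matched - i1)[..., None]
    return np.clip(image1 + delta, 0.0, 255.0)
```

Histogram matching is built directly from the two empirical CDFs, which is the standard construction; a full nonlinear IHS model would need an explicit forward and inverse transform instead of the additive shortcut.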

Embodiment 2

[0087] Embodiment 2: In remote sensing image fusion, the biggest advantage of SFIM over the standard IHS transform and the Brovey transform is that it not only improves the fusion of spatial information but also better preserves the spectral characteristics of the source images; the resulting fused image is independent of the spectral properties of the high-resolution image. SFIM is, however, more sensitive to the registration accuracy of the images. The images to be fused, Image1 and Image2, produced by this IHS-based object and scene fusion method have no registration problem (they can be considered 100% registered), so SFIM can be introduced as the brightness fusion tool. Using it to achieve brightness fusion can be described by the following formula:

[0088] I_fused = (I1 × I2) / Ī2, where Ī2 is I2 smoothed with a mean (low-pass) filter (the standard SFIM form).

[0089] Among them, I1 and I2 are the brightness components of the new scene image and of the new scene image with the target object added, respectively.
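Assuming the standard SFIM form I_fused = I1 · I2 / mean(I2), a minimal Python sketch of this brightness fusion step might look as follows; the box-filter size and the small epsilon guarding against division by zero are illustrative choices, not values from the patent.

```python
import numpy as np

def box_filter(img, size=7):
    # Local mean via a padded sliding-window sum (simple box filter).
    pad = size // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

def sfim_fusion(i1, i2, size=7, eps=1e-6):
    # SFIM: modulate I1 by the high-frequency ratio I2 / mean(I2),
    # which injects I2's spatial detail while keeping I1's base level.
    return i1 * i2 / (box_filter(i2, size) + eps)
```

When I2 is locally flat the ratio is close to 1 and I1 passes through unchanged, which is why SFIM preserves the spectral character of the base image.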

Embodiment 3

[0092] Embodiment 3: The detail information extracted from one image by the wavelet transform can be fused into another image through many methods, such as simple replacement or superposition. Moreover, in most remote sensing image fusion applications, wavelet-based schemes achieve better fusion results than standard schemes without wavelets, especially in minimizing color distortion. Based on this, this embodiment introduces wavelet technology together with a weighted model; the steps for using the wavelet transform as the brightness fusion tool to realize object and scene fusion are as follows:

[0093] ① Perform IHS transformation on the scene image Image1 and on the scene image Image2 (the scene with the target object added) to obtain Image1_hsi and Image2_hsi, and take their brightness components I1 and I2;

[0094] ② Use the histogram of I1 as the reference histogram and perform histogram matching on I2 to obtain I2';
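Wavelet-based brightness fusion can be illustrated with a hand-rolled one-level Haar transform, so the example stays self-contained. Keeping I1's approximation subband and blending the detail subbands with a weight w is an assumed form of the weighted model mentioned above; the remaining steps of this embodiment are truncated here, so this is a hypothetical completion, not the patent's exact scheme.

```python
import numpy as np

def haar_dwt2(x):
    # One-level 2-D Haar transform: approximation + 3 detail subbands.
    a = (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 4
    h = (x[0::2, 0::2] + x[0::2, 1::2] - x[1::2, 0::2] - x[1::2, 1::2]) / 4
    v = (x[0::2, 0::2] - x[0::2, 1::2] + x[1::2, 0::2] - x[1::2, 1::2]) / 4
    d = (x[0::2, 0::2] - x[0::2, 1::2] - x[1::2, 0::2] + x[1::2, 1::2]) / 4
    return a, h, v, d

def haar_idwt2(a, h, v, d):
    # Exact inverse of haar_dwt2 (perfect reconstruction).
    rows, cols = a.shape
    out = np.empty((rows * 2, cols * 2))
    out[0::2, 0::2] = a + h + v + d
    out[0::2, 1::2] = a + h - v - d
    out[1::2, 0::2] = a - h + v - d
    out[1::2, 1::2] = a - h - v + d
    return out

def wavelet_brightness_fusion(i1, i2, w=0.6):
    # Assumed weighted model: keep I1's approximation subband (spectral
    # base) and blend the detail subbands with weight w on I2's details.
    a1, h1, v1, d1 = haar_dwt2(i1)
    _,  h2, v2, d2 = haar_dwt2(i2)
    blend = lambda c1, c2: (1 - w) * c1 + w * c2
    return haar_idwt2(a1, blend(h1, h2), blend(v1, v2), blend(d1, d2))
```

With w = 0 the function returns I1 unchanged, and with w = 1 it performs the "simplest replacement" of detail coefficients mentioned above; intermediate weights give the weighted superposition.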



Abstract

The invention discloses an object and scene fusion method based on the IHS (Intensity, Hue, Saturation) transform. The method comprises the following steps: first, using the IHS transform and intensity fusion to form a light mask of an image; then achieving the fusion of the object and the scene by recovering object details within the light mask. The invention provides three schemes, using a standard IHS transform, SFIM (Smoothing Filter-based Intensity Modulation), and wavelet technology, respectively, as the intensity fusion tool. In this method, the images to be fused require no rectification, few fusion conditions are needed, and the method is easy to implement: object and scene fusion is achieved by considering intensity fusion alone. The method is fast, has great potential in video processing, and minimizes the color distortion generated in the fusion process; furthermore, it is widely applicable, since different intensity fusion algorithms can be plugged in to form different fusion schemes.

Description

technical field [0001] The invention relates to an object and scene fusion method based on the IHS (Intensity-Hue-Saturation color space) transform, belongs to the field of visualization technology, and is a non-real-time visual simulation technology. Background technique [0002] Object and scene fusion is an application direction of image fusion. It refers to segmenting a target object of interest out of its original scene and then synthesizing it into another scene through superposition, combination, and processing; the new object-scene image must look real and natural to create the new image effect. Object and scene fusion is widely used in image editing, especially in film and television production, where many shots cannot be obtained through field shooting, for example embedding the behavior of real actors into a virtual world; such shots can be achieved with object and scene fusion technology. ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T11/00; G06T7/00
Inventors: 丁友东 (Ding Youdong), 魏小成 (Wei Xiaocheng)
Owner: SHANGHAI UNIV