
Image fusion method, and training method and device of image fusion model

An image fusion and image fusion model technology, applied in the field of computer vision, which addresses the problems that existing fusion methods cannot guarantee the fusion effect and lose a large amount of detail in the output image, thereby degrading output image quality.

Pending Publication Date: 2022-03-01
HUAWEI TECH CO LTD

AI Technical Summary

Problems solved by technology

However, current fusion methods cannot guarantee the fusion effect, and a large amount of detail is lost in the output image, which degrades the quality of the output image.

Method used


Examples


Example 1

[0190] Example 1: Obtaining a color image and an infrared image based on a dichroic prism.

[0191] As shown in Figure 6, the dichroic prism includes a prism 6020 and a filter 6030. The incident light received by the lens 6010 can be split into visible light and near-infrared light by the dichroic prism, and the visible light and the near-infrared light are imaged by two sensors, the color sensor 6040 and the near-infrared sensor 6050, respectively, so that a color image and an infrared image are obtained at the same time.

Example 2

[0192] Example 2: Obtain color images and infrared images based on time-sharing and frame interpolation.

[0193] As shown in Figure 7, the supplementary light control unit 7030 periodically turns the infrared supplementary light unit 7010 on and off to control whether visible light or infrared light is transmitted from the lens 7020 to the surface of the sensor 7040, so that the two types of light are imaged separately. It should be understood that the infrared image shown in Figure 7 can also be a composite image of an infrared image and a color image: under low illumination, the amount of color-image information in the composite image is small, so the composite image can be used as the infrared image in the embodiments of the present application. Through a frame interpolation algorithm, a color image and an infrared image at the same moment are obtained. Frame interpolation refers to obtaining an image of an intermediate frame through image ...
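The paragraph above stops mid-definition, so as a rough illustration of what frame interpolation can mean in this time-shared capture scheme, here is a minimal sketch that estimates a color frame at the infrared frame's timestamp by linearly blending the two neighbouring color frames. The function name, the timestamps, and the linear-blend strategy are assumptions for illustration only; the interpolation algorithm actually used in the patent is not specified here and could, for example, be motion-compensated.

```python
import numpy as np

def interpolate_frame(prev_frame: np.ndarray, next_frame: np.ndarray,
                      t_prev: float, t_next: float, t_target: float) -> np.ndarray:
    """Estimate an intermediate frame at t_target by linear temporal blending.

    prev_frame / next_frame: color frames captured at times t_prev / t_next
    (both H x W x 3, same dtype). Linear blending is only the simplest
    illustration; a motion-compensated interpolator could be substituted.
    """
    # Blending weight grows as the target time approaches the next frame.
    alpha = (t_target - t_prev) / (t_next - t_prev)
    blended = (1.0 - alpha) * prev_frame.astype(np.float32) \
              + alpha * next_frame.astype(np.float32)
    return blended.astype(prev_frame.dtype)

# Hypothetical usage: align a color frame to the timestamp of an infrared frame
# captured between two color exposures in the time-shared capture sequence.
# color_at_ir_time = interpolate_frame(color_t0, color_t1, 0.0, 40.0, 20.0)
```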

Example 3

[0194] Example 3: Obtain a color image and an infrared image based on an RGB-Near-infrared (NIR) sensor.

[0195] As shown in Figure 8, with an RGB-NIR sensor design, a color image and an infrared image are obtained at the same time in a single exposure.
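As a hedged illustration of how a single RGB-NIR capture can yield both a color image and an infrared image, the sketch below splits a raw mosaic into RGB and NIR planes. The 2x2 filter layout, the function name, and the quarter-resolution output are assumptions made for the example; real RGB-NIR sensors use vendor-specific patterns and a proper demosaicing step.

```python
import numpy as np

def split_rgb_nir_mosaic(raw: np.ndarray):
    """Split one raw RGB-NIR exposure into a color image and an NIR image.

    Assumes a hypothetical 2x2 colour filter array laid out as
        R    G
        NIR  B
    repeated over the sensor; adjust the offsets for the actual pattern.
    """
    r   = raw[0::2, 0::2]
    g   = raw[0::2, 1::2]
    nir = raw[1::2, 0::2]
    b   = raw[1::2, 1::2]
    # Quarter-resolution colour and NIR planes from a single exposure;
    # a demosaicing step would normally restore full resolution.
    color = np.stack([r, g, b], axis=-1)
    return color, nir
```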

[0196] The training method of the image fusion model and the image fusion method of the embodiment of the present application will be described in detail below with reference to the accompanying drawings.

[0197] Figure 9 is a schematic diagram of the image fusion device 600 of the embodiment of the present application. In order to better understand the method in the embodiment of the present application, the functions of each module in Figure 9 are briefly described below.

[0198] The apparatus 600 may be a cloud service device or a terminal device, such as a computer, a server, or another device with sufficient computing power to train the image fusion model, or a system composed of a cloud service...



Abstract

The invention provides an image fusion method and a training method and device for an image fusion model, and relates to the field of artificial intelligence, in particular to the field of computer vision. The image fusion method comprises the following steps: acquiring a color image to be processed, an infrared image and a background reference image, wherein the infrared image and the color image to be processed are shot of the same scene; and inputting the color image to be processed, the infrared image and the background reference image into the image fusion model for feature extraction, and performing image fusion based on the extracted features to obtain a fused image. The method improves the image quality of the fused image while ensuring that the colors of the fused image are accurate and natural.
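The abstract only specifies the model's interface (a color image, an infrared image and a background reference image go in; features are extracted and fused; one fused image comes out), not its architecture. The following PyTorch sketch is a toy stand-in for that interface; the class name, layer sizes, and the concatenation-based fusion are assumptions for illustration and not the patented network.

```python
import torch
import torch.nn as nn

class SimpleFusionNet(nn.Module):
    """Toy three-input fusion network mirroring the described interface:
    colour, infrared and background reference images in, fused image out.
    Layer choices are illustrative only."""

    def __init__(self, feat: int = 32):
        super().__init__()
        # One small feature extractor per input modality (assumption).
        self.enc_color = nn.Conv2d(3, feat, kernel_size=3, padding=1)
        self.enc_ir = nn.Conv2d(1, feat, kernel_size=3, padding=1)
        self.enc_ref = nn.Conv2d(3, feat, kernel_size=3, padding=1)
        # Fuse the extracted features and reconstruct a colour image.
        self.fuse = nn.Sequential(
            nn.Conv2d(3 * feat, feat, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(feat, 3, kernel_size=3, padding=1),
        )

    def forward(self, color, infrared, reference):
        feats = torch.cat(
            [self.enc_color(color), self.enc_ir(infrared), self.enc_ref(reference)],
            dim=1,
        )
        return self.fuse(feats)

# Hypothetical usage with N x C x H x W tensors:
# fused = SimpleFusionNet()(color_img, ir_img, bg_ref)
```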

Description

Technical field

[0001] The embodiments of the present application relate to the field of computer vision, and in particular to an image fusion method and a training method and device for an image fusion model.

Background technique

[0002] Computer vision is an integral part of various intelligent/autonomous systems in application fields such as manufacturing, inspection, document analysis, medical diagnosis, and military. It concerns how to use cameras/video cameras and computers to obtain the data and information of the photographed subject that we need. Figuratively speaking, it means giving computers eyes (cameras/video cameras) and a brain (algorithms) so that they can identify, track and measure targets in place of human eyes, enabling computers to perceive the environment. Because perception can be regarded as extracting information from sensory signals, computer vision can also be regarded as the science of how to make artificial systems "perceive" from images or multidimensional data. In general, computer ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/00; G06T5/50
CPC: G06T5/50; G06T2207/10004; G06T2207/10024; G06T2207/10048; G06T2207/20221; G06T5/70; G06T2207/20081; G06T2207/20084; G06T2207/30196; G06T2207/30232; G06T2207/30236; G06T5/73; G06T5/60
Inventor: 吴华珍, 许婷婷, 黄芝娟
Owner: HUAWEI TECH CO LTD