
Low-illumination image target detection method based on image fusion and target detection network

A target-detection and image-fusion technology, applied in the fields of image processing and machine vision. It addresses problems such as targets that are difficult to identify accurately, sharp drops in accuracy, and poor discrimination between adjacent targets, and achieves the effect of improved detection accuracy.

Pending Publication Date: 2021-03-12
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

The former traverses the entire image with a window, extracts image features at each window position, and finally classifies those features with a classifier. The latter extracts candidate target regions by saliency or objectness detection, avoiding the sliding window's large computational cost; however, because a candidate box may frame only part of a target or may poorly distinguish adjacent targets, it affects the detection results. Many methods have been proposed to address this; for example, some literature proposes more targeted region-extraction methods that reduce the number of candidate windows while improving their quality.
[0003] Although deep-learning networks have advanced greatly in the field of target detection, they are essentially based on visible-light images for target recognition. When ambient lighting conditions are poor, their accuracy drops sharply.
Infrared images can compensate for poor lighting conditions to some extent, but because infrared images lack texture and detail information, it is difficult to identify targets accurately using infrared images alone.

Method used




Embodiment Construction

[0046] The present invention will now be further described in conjunction with the embodiments and the accompanying drawings:

[0047] 1. Perform the NSSCT transform on the visible-light image and the infrared image respectively

[0048] Compared with visible-light images, infrared images generally have higher gray values. Spatial-domain processing that relies only on intensity transformations therefore tends to over-weight the infrared components in the fused image. To preserve edge details while improving the contrast of the fused image, the present invention adopts the NSSCT transform, introduced as follows:

[0049] The NSSCT transform builds on the shearlet transform by using a nonsubsampled scale transform and nonsubsampled directional filters, and has good translation invariance. The NSSCT transform is therefore an optimal approximation to sparse representations of images.
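The patent does not give the NSSCT implementation itself. As an illustration of the key property it relies on, a nonsubsampled (undecimated) multiscale decomposition keeps every band at full resolution, which is what makes the transform shift-invariant and lets fusion rules act on matching pixels across bands. The sketch below uses a simple à trous (Gaussian-difference) decomposition as a stand-in for the nonsubsampled scale transform, and a max-absolute rule for detail bands with averaging for the base band; the function names and the specific fusion rules are illustrative assumptions, not the patent's method.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def atrous_decompose(img, levels=3):
    """Nonsubsampled multiscale decomposition (à trous style): every band
    keeps full resolution, so the decomposition is shift-invariant and
    perfectly reconstructable as approx + sum(details)."""
    approx = img.astype(float)
    details = []
    for k in range(levels):
        smooth = gaussian_filter(approx, sigma=2.0 ** k)
        details.append(approx - smooth)   # band-pass detail at scale k
        approx = smooth
    return approx, details

def fuse(visible, infrared, levels=3):
    """Illustrative fusion: average the low-pass bands (base illumination),
    keep the stronger detail coefficient per pixel (edges/texture)."""
    a_v, d_v = atrous_decompose(visible, levels)
    a_i, d_i = atrous_decompose(infrared, levels)
    fused_approx = 0.5 * (a_v + a_i)
    fused_details = [np.where(np.abs(dv) >= np.abs(di), dv, di)
                     for dv, di in zip(d_v, d_i)]
    return fused_approx + sum(fused_details)
```

Because the bands are not subsampled, reconstruction is exact and fusing an image with itself returns the image unchanged, which is a convenient sanity check for any such decomposition.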



Abstract

The invention relates to a low-illumination image target detection method based on image fusion and a target detection network. The method combines a non-subsampled shearlet transform image-fusion algorithm with an improved YOLOv4 target detection network, and comprises the steps of: 1) employing an infrared and low-illumination visible-light image-fusion technique so that the infrared and visible-light images complement each other, highlighting texture information while improving contour definition; 2) sending the fused image to the improved YOLOv4 target detection network for detection, and outputting target information under low-illumination conditions; and 3) to improve the feature-extraction capability of the YOLOv4 network, replacing the residual blocks in the YOLOv4 backbone with densely connected blocks, which, compared with residual blocks, improve the network's feature-expression and feature-extraction capabilities. Experiments show that the method improves target detection under low-illumination conditions.
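Step 3 of the abstract swaps residual blocks for densely connected blocks. The structural difference can be shown without any deep-learning framework: a residual block *adds* its transform back to the input (channel count unchanged), while a dense block *concatenates* each layer's output with all earlier feature maps, so channels grow by a fixed rate and early features are reused directly. The toy below uses a plain 1x1 convolution via `einsum` as the per-layer transform; it is a minimal sketch of the connectivity patterns only, not the patent's YOLOv4 backbone.

```python
import numpy as np

def conv1x1(x, w):
    """Toy 1x1 convolution: linearly mixes channels at each spatial
    position. x: (C_in, H, W), w: (C_out, C_in) -> (C_out, H, W)."""
    return np.einsum('oc,chw->ohw', w, x)

def residual_block(x, w):
    # Residual connection: transform is ADDED back, so the output has
    # the same number of channels as the input.
    return x + conv1x1(x, w)

def dense_block(x, weights):
    # Dense connectivity: each layer sees the CONCATENATION of the input
    # and all earlier outputs, so features are reused rather than summed
    # and the channel count grows by the growth rate per layer.
    feats = [x]
    for w in weights:
        inp = np.concatenate(feats, axis=0)   # (C_so_far, H, W)
        feats.append(conv1x1(inp, w))         # adds C_out new channels
    return np.concatenate(feats, axis=0)
```

With a 4-channel input, growth rate 2, and three layers, the layer weights have shapes (2, 4), (2, 6), (2, 8) and the block outputs 4 + 3*2 = 10 channels, with the original input preserved as the first 4 channels.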

Description

technical field

[0001] The invention belongs to the fields of machine vision and image processing, and relates to a low-illumination image target detection method based on image fusion and a target detection network.

Background technique

[0002] Target detection and recognition refers to locating and identifying targets of interest in an image by technical means. There are currently two main types of object detection methods: sliding-window-based methods and region-proposal (objectness) based methods. The former traverses the entire image with a window, extracts image features at each window position, and finally classifies those features with a classifier. The latter extracts candidate target regions by saliency or objectness detection, avoiding the sliding window's large computational cost; however, because a candidate box may frame only part of a target or may poorly distinguish adjacent targets, it has a certain impact ...
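The sliding-window approach described in [0002] can be sketched in a few lines: traverse the image with a fixed window and stride, extract features per window, and classify each one. The sketch below uses mean intensity thresholding as a stand-in classifier (a real detector would use learned features such as HOG + SVM or a CNN head); all names and the default classifier are illustrative assumptions.

```python
import numpy as np

def sliding_window_detect(img, win=8, stride=4, classify=None):
    """Sliding-window detection skeleton: slide a win x win window over
    the image with the given stride, and keep every window the
    classifier accepts, as (x, y, w, h) boxes."""
    if classify is None:
        # Stand-in classifier: flag bright windows. This is only a
        # placeholder for a learned feature extractor + classifier.
        classify = lambda patch: patch.mean() > 0.5
    H, W = img.shape
    boxes = []
    for y in range(0, H - win + 1, stride):
        for x in range(0, W - win + 1, stride):
            patch = img[y:y + win, x:x + win]
            if classify(patch):
                boxes.append((x, y, win, win))
    return boxes
```

The cost the passage refers to is visible here: the number of classifier calls grows with (H/stride) * (W/stride) per window size, which is what region-proposal methods avoid.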


Application Information

IPC(8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/08, G06V20/00, G06V2201/07, G06N3/045
Inventor: 许悦雷, 加尔肯别克, 崔祺, 周清, 回天
Owner NORTHWESTERN POLYTECHNICAL UNIV