
Infrared ship video tracking method based on convolutional neural network

A convolutional-neural-network-based technology, applied in the field of infrared ship video tracking, which addresses problems such as the scarcity of infrared target tracking algorithms and the lack of infrared target tracking data sets.

Pending Publication Date: 2021-04-23
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

Although infrared cameras have been widely used in marine environments, there are few infrared target tracking algorithms based on deep learning, mainly because of the lack of data sets for infrared target tracking.



Detailed Description of the Embodiments

[0018] An embodiment of the present invention provides a convolutional neural network-based infrared ship video tracking method. The present invention will be explained and elaborated below in conjunction with the accompanying drawings:

[0019] The data processing procedure is as follows: a program extracts every frame of the infrared video (Figure 1(a)); each frame has 3 channels, pixel values in [0, 255], and a size of 256×256. The LabelMe software is used to draw a bounding box around the ship target in each frame, and the box should fit the target as tightly as possible (Figure 1(b)). The labeling result is an xml file (Figure 1(c)) whose file name is the same as the frame image name; the label file contains the coordinates of the upper-left and lower-right corners of the target bounding box.
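As an illustration only, the following Python sketch shows one way to carry out this data-preparation step with OpenCV and the standard xml parser. The file names, directory layout, and xml tag names (assumed to follow the Pascal VOC layout that LabelMe can export) are assumptions for the sketch, not details taken from the patent.

# Hedged sketch of the data-preparation step: extract frames from an infrared
# video, resize them to 256x256, and read the bounding box (upper-left and
# lower-right corners) from a Pascal-VOC-style xml annotation.
import os
import xml.etree.ElementTree as ET

import cv2


def extract_frames(video_path, out_dir, size=(256, 256)):
    """Save every frame of the infrared video as a 256x256, 3-channel image."""
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()  # frame is HxWx3 with 8-bit pixel values
        if not ok:
            break
        frame = cv2.resize(frame, size)
        cv2.imwrite(os.path.join(out_dir, f"{idx:06d}.jpg"), frame)
        idx += 1
    cap.release()
    return idx


def read_box(xml_path):
    """Return (xmin, ymin, xmax, ymax) of the labelled ship target."""
    root = ET.parse(xml_path).getroot()
    box = root.find(".//bndbox")  # tag layout assumed, per Pascal VOC
    return tuple(int(box.find(tag).text) for tag in ("xmin", "ymin", "xmax", "ymax"))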

[0020] The flow of the embodiment of the present invention is as follows:

[0021] Step 1: Based on the feature extraction network of SiamRPN, add a multi-layer fusion network. The featu...
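The paragraph above is truncated, but the abstract states that layers three to five of the SiamRPN feature extraction network are fused to increase the spatial information of the output features. The PyTorch sketch below shows one plausible form of such a multi-layer fusion module; the channel counts and the fusion-by-summation choice are assumptions, not the patent's specification.

# Hedged sketch of a multi-layer fusion module: project the conv3, conv4 and
# conv5 feature maps of the SiamRPN backbone to a common depth, resize them to
# a common spatial size, and sum them so the output keeps more spatial detail.
import torch.nn as nn
import torch.nn.functional as F


class MultiLayerFusion(nn.Module):
    def __init__(self, in_channels=(384, 384, 256), out_channels=256):
        super().__init__()
        # 1x1 convolutions bring each backbone stage to the same channel count.
        self.proj = nn.ModuleList(
            nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels
        )

    def forward(self, feats):
        # feats: [conv3, conv4, conv5] feature maps, shallowest (finest) first.
        target_size = feats[0].shape[-2:]  # keep the finest spatial resolution
        fused = 0
        for f, proj in zip(feats, self.proj):
            f = proj(f)
            if f.shape[-2:] != target_size:
                f = F.interpolate(f, size=target_size, mode="bilinear",
                                  align_corners=False)
            fused = fused + f
        return fused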



Abstract

The invention discloses an infrared ship video tracking method based on a convolutional neural network. The method fuses layers three to five of the SiamRPN feature extraction network to increase the spatial information of the output features and thereby improve recognition. A spatial transformation network composed of a localization network, a grid generator and a sampler is added to the search branch to rotate and scale the feature map, reducing the influence of target rotation and scaling on the recognition rate. In addition, to address the lack of an infrared ship video tracking data set, the invention constructs a data set containing 3000 infrared ship target images. By extracting more accurate infrared target features with the multi-layer fusion network and the spatial transformation network, the method improves tracking accuracy; it is highly operable and extensible and is suitable for tracking infrared ship targets against a sea-surface background.
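For context, the spatial transformation network named in the abstract consists of three standard components: a localization network that predicts an affine transform, a grid generator, and a sampler. The PyTorch sketch below shows these components applied to a search-branch feature map; the layer sizes are illustrative assumptions rather than the patent's actual configuration.

# Hedged sketch of a spatial transformation network on the search branch:
# the localization network predicts 2x3 affine parameters, affine_grid acts
# as the grid generator, and grid_sample acts as the sampler that warps the
# feature map to compensate for target rotation and scaling.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatialTransformer(nn.Module):
    def __init__(self, channels=256):
        super().__init__()
        self.loc = nn.Sequential(
            nn.Conv2d(channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, 6),  # 2x3 affine transform parameters
        )
        # Initialise to the identity transform so early training is stable.
        self.loc[-1].weight.data.zero_()
        self.loc[-1].bias.data.copy_(
            torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))

    def forward(self, x):
        theta = self.loc(x).view(-1, 2, 3)                           # localization network
        grid = F.affine_grid(theta, x.size(), align_corners=False)   # grid generator
        return F.grid_sample(x, grid, align_corners=False)           # sampler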

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and in particular relates to an infrared ship video tracking method based on a convolutional neural network.

Background Technique

[0002] With the development and popularization of video imaging equipment and the rapid growth of storage devices in recent years, the demand for intelligent video applications in social life, industrial production, and public safety has become increasingly urgent. Understanding and analyzing a video requires knowing the position and trajectory of the target in the video. As a basic research topic of computer vision, the visual target tracking task estimates the position of the target in subsequent video frames from the target given in the initial frame.

[0003] Most current research on visual tracking is based on visible-light video, but visible-light target tracking can only be carried out under the ...


Application Information

IPC(8): G06T7/246
CPC: G06T7/246; G06T2207/10016; G06T2207/10048; G06T2207/20081; G06T2207/20084
Inventor: 唐然, 刘兆英, 张婷, 李玉鑑
Owner: BEIJING UNIV OF TECH