Video multi-target tracking method using convolutional neural network and bidirectional matching algorithm

A video multi-target tracking method based on a convolutional neural network and a bidirectional matching algorithm, applicable to neural learning methods, biological neural network models, neural architectures, etc. It addresses the problems of slow appearance-feature extraction, difficult target tracking, and tracking interruption, and achieves the effects of tracking many targets, improving training speed, and reducing tracking loss.

Pending Publication Date: 2022-02-25
JIANGXI UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

However, appearance-feature extraction takes a long time, making real-time target tracking difficult.
When appearance features are not used, tracking speed improves greatly, but the matching algorithm is prone to tracking interruption when a target is occluded or disappears briefly.




Embodiment Construction

[0028] The present invention is further described below with reference to the accompanying drawings and specific embodiments. A video multi-target tracking method using a convolutional neural network and a bidirectional matching algorithm comprises the following specific implementation steps:

[0029] (S1): Split the input video into frames for model training and for actual tracking. Training the model requires the target bounding-box information and the target ID information in the video, where the same ID across video frames denotes the same target. Model training feeds the image information of the current frame and a past frame into the convolutional neural network. The center-point position of each target is computed from its bounding-box information, and a two-dimensional Gaussian distribution is generated around that center point. Generate a target cen...
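The Gaussian center-point rendering described in step (S1) can be sketched as follows. This is an illustrative implementation, not the patent's exact formulation: the function name, a fixed `sigma`, and the per-pixel-max handling of overlapping targets are all assumptions (the patent may instead scale the Gaussian radius with target size).

```python
import numpy as np

def gaussian_heatmap(boxes, shape, sigma=2.0):
    """Render a 2-D Gaussian at each bounding-box center.

    boxes: iterable of (x1, y1, x2, y2) corner coordinates.
    shape: (H, W) of the output heatmap.
    sigma: standard deviation of the Gaussian, in pixels (assumed fixed here).
    """
    h, w = shape
    heat = np.zeros((h, w), dtype=np.float32)
    ys, xs = np.mgrid[0:h, 0:w]  # per-pixel row/column coordinates
    for x1, y1, x2, y2 in boxes:
        cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0  # box center
        g = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
        heat = np.maximum(heat, g)  # keep per-pixel max when targets overlap
    return heat
```

The resulting map peaks at 1.0 at each target center and decays smoothly, which is the usual training target for anchor-free center-point detectors.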



Abstract

The invention discloses a video multi-target tracking method using a convolutional neural network and a bidirectional matching algorithm. The convolutional neural network, based on an anchor-free target detection method, provides the center-point position and scale information of each target by fusing feature information at different scales. On the basis of this network, the invention adds a bidirectional motion-prediction branch for the target and designs a corresponding multi-target tracking matching algorithm based on the bidirectional motion information, so that bidirectional tracking matching of targets is completed from multi-frame video input; tracking interruption caused by brief occlusion of a target is mitigated through the design of a stranding region that temporarily holds lost tracks. In addition, an attention module is used in the convolutional neural network, which effectively shortens model training time and improves the efficiency of the algorithm. Experimental results on the MOT17 multi-target tracking dataset show that the proposed algorithm achieves high tracking accuracy.
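The two-way consistency idea behind bidirectional tracking matching can be illustrated with a minimal mutual-nearest-neighbour matcher over predicted centers. This is a loose stand-in for the patented matching algorithm, not a reproduction of it: the function name, the Euclidean-distance cost, and the `max_dist` gate are all hypothetical choices for illustration.

```python
import numpy as np

def mutual_nearest_match(track_pred, det_centers, max_dist=20.0):
    """Match tracks to detections by mutual nearest neighbour.

    track_pred:  (N, 2) track centers predicted forward into the current frame.
    det_centers: (M, 2) detection centers in the current frame.
    Returns a list of (track_index, detection_index) pairs accepted only when
    each side is the other's nearest neighbour and within max_dist pixels.
    """
    if len(track_pred) == 0 or len(det_centers) == 0:
        return []
    # Pairwise Euclidean distances, shape (N, M).
    d = np.linalg.norm(track_pred[:, None, :] - det_centers[None, :, :], axis=2)
    fwd = d.argmin(axis=1)  # nearest detection for each track
    bwd = d.argmin(axis=0)  # nearest track for each detection
    return [(t, j) for t, j in enumerate(fwd)
            if bwd[j] == t and d[t, j] <= max_dist]
```

Tracks left unmatched by such a check would be the natural candidates for the stranding region, where they are held for a few frames in case the target reappears after occlusion.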

Description

technical field

[0001] The invention belongs to the field of computer vision and relates to digital image and video processing, target detection, and multi-target tracking.

Background technique

[0002] The multi-target tracking task for video involves both target detection and target recognition. Its purpose is to detect multiple objects in the scene by analyzing video images, maintain their IDs, and record their trajectories. Unlike single-target tracking, the performance of a multi-target tracking algorithm does not degrade as the number of tracked targets increases, so it can be widely applied in surveillance video analysis, real-time pedestrian trajectory tracking, vehicle and ship trajectory tracking, and other scenarios.

[0003] With the development of deep learning and the advancement of convolutional neural networks, the performance of target detection algorithms has been greatly improved, which directly promotes the progress of multi-...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/246; G06V10/74; G06V10/82; G06K9/62; G06N3/04; G06N3/08
CPC: G06T7/246; G06N3/08; G06T2207/20081; G06T2207/20084; G06N3/045; G06F18/22
Inventors: 曾泽华; 罗会兰
Owner JIANGXI UNIV OF SCI & TECH