Multi-target tracking method based on deep neural network

A deep-neural-network multi-target tracking technology in the field of machine vision. It addresses problems such as heavy computation, difficulty of deployment on limited hardware, and the inability to track targets continuously, with the effect of improving accuracy and reducing the ID-Switch metric.

Pending Publication Date: 2021-12-03
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

Although deep neural networks have improved the performance of target detection and recognition tasks, they suffer from problems such as large parameter scale and complex computation, and place high demands on computing and storage resources. This makes it difficult to use them effectively and widely on resource-constrained mobile devices such as mobile phones and vehicle-mounted equipment.
In addition, when performing multi-target recognition and tracking, existing methods fail to integrate the appearance information of the target to be tracked into the association and matching process. When the target is occluded, this easily causes false detections and frequent ID jumps, so truly continuous tracking of the target cannot be realized.



Embodiment Construction

[0038] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0039] In order to make the above objects, features and advantages of the present invention more comprehensible, the present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0040] The present invention provides a multi-target tracking method based on a deep neural network. Referring to Figure 1, the method includes the following steps:

[0041] S101. Collect a video to be tested, perform prepr...
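The per-frame loop implied by these steps (detect targets, match them against the previous frame, carry IDs forward, allocate new IDs to unmatched detections) can be sketched as follows. This is an illustrative skeleton, not the patent's implementation: `detect` and `associate` are assumed callback interfaces standing in for the patent's detector and matching stage.

```python
def track_video(frames, detect, associate):
    """Skeleton of the tracking loop.

    `detect(frame)` -> list of detection dicts with a "box" key (assumed).
    `associate(tracks, dets)` -> list of (track_idx, det_idx) pairs (assumed).
    Matched detections keep the ID of the track they matched; unmatched
    detections are treated as new targets and allocated a fresh ID.
    """
    tracks, next_id, results = [], 0, []
    for frame in frames:
        dets = detect(frame)
        matches = associate(tracks, dets)
        matched_dets = {di for _, di in matches}
        new_tracks = []
        for ti, di in matches:
            dets[di]["id"] = tracks[ti]["id"]   # same target: keep its ID
            new_tracks.append(dets[di])
        for di, d in enumerate(dets):
            if di not in matched_dets:          # new target: allocate an ID
                d["id"] = next_id
                next_id += 1
                new_tracks.append(d)
        tracks = new_tracks
        results.append([(t["id"], t["box"]) for t in tracks])
    return results
```

With a trivial detector and index-order association, two detections in the first frame receive IDs 0 and 1, and the slightly shifted detections in the second frame keep those same IDs.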



Abstract

The invention discloses a multi-target tracking method based on a deep neural network. The method comprises the following steps: collecting a to-be-tested video, preprocessing it, and extracting its original image frames; performing target detection on each original image frame, identifying the to-be-tracked targets, and obtaining a target detection frame for each image frame; matching the target detection frames of two consecutive frames on the time axis, calculating the similarity of the to-be-tracked targets in the detection frames, and comparing these similarities to judge whether the detections correspond to the same target; if yes, allocating an ID and outputting a tracking result; if not, carrying out matching and judgment again; and realizing continuous multi-target tracking of the video based on the IDs and tracking results. Motion features and appearance features are fused into the loss-matrix calculation, improving the prediction accuracy for the target in the next frame and reducing the ID-Switch metric, so that continuous tracking of the target is truly realized.
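The fusion of motion and appearance features into the loss (cost) matrix can be illustrated with a minimal sketch. Here box IoU stands in for the motion cue and cosine similarity of appearance embeddings for the appearance cue; the fusion weight `lam`, the gating threshold `max_cost`, and the greedy matcher are illustrative assumptions, not values or components taken from the patent.

```python
import math

def iou(a, b):
    # Boxes as (x1, y1, x2, y2); intersection-over-union in [0, 1].
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def cosine_sim(u, v):
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(y * y for y in v))
    return dot / (nu * nv)

def fused_cost(track_box, det_box, track_feat, det_feat, lam=0.5):
    # Lower cost = better match; lam weights motion vs appearance (assumed).
    motion_cost = 1.0 - iou(track_box, det_box)
    appearance_cost = 1.0 - cosine_sim(track_feat, det_feat)
    return lam * motion_cost + (1.0 - lam) * appearance_cost

def match(tracks, dets, max_cost=0.7):
    """Greedy association over the fused cost matrix: take (track, det)
    pairs in increasing cost order, skipping anything already matched or
    above the gating threshold. Returns (track_idx, det_idx) pairs."""
    pairs = sorted(
        (fused_cost(t["box"], d["box"], t["feat"], d["feat"]), ti, di)
        for ti, t in enumerate(tracks)
        for di, d in enumerate(dets)
    )
    used_t, used_d, matches = set(), set(), []
    for cost, ti, di in pairs:
        if cost <= max_cost and ti not in used_t and di not in used_d:
            used_t.add(ti)
            used_d.add(di)
            matches.append((ti, di))
    return matches
```

Because the appearance term penalizes dissimilar embeddings even when boxes overlap, two nearby targets with distinct appearance features are less likely to swap IDs between frames, which is the intuition behind the reduced ID-Switch metric. A production system would typically replace the greedy loop with an optimal assignment solver (e.g. the Hungarian algorithm).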

Description

Technical field

[0001] The invention relates to the technical field of machine vision, and in particular to a multi-target tracking method based on a deep neural network.

Background technique

[0002] Early image recognition and detection relied mainly on the extraction of hand-designed visual feature descriptors (such as color, shape, and edge). However, this traditional hand-crafted approach is built on prior knowledge of existing data sets, so its limitations are large: its coverage and inclusiveness of real-world objects are small, it is not sufficient to find salient objects in complex scenes or to accurately delineate object boundaries, and it is difficult to achieve satisfactory performance.

[0003] The problem of multi-target tracking first appeared in radar signal detection. With in-depth research in the field of computer vision and the continuous improvement of the accuracy of target detection algorithms, multi-target tracking algorithms based...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/246; G06N3/04; G06N3/08
CPC: G06T7/246; G06N3/08; G06T2207/10016; G06T2207/20081; G06T2207/20084; G06N3/045
Inventor: 邢建川, 蒋芷昕, 孔渝峰, 张栋, 卢胜, 陈洋, 周春文, 杨明兴
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA