
Multi-target tracking method based on multi-model fusion and data association

A multi-target tracking and data association technology, applied in the cross-disciplinary fields of image processing, video detection and artificial intelligence. It addresses problems such as the failure to automatically recover a target after it reappears, the inability to meet real-time requirements, and the inability to maintain accurate tracking, and achieves reduced interference from lighting and background noise, good real-time performance and robustness, and fast processing speed.

Active Publication Date: 2017-10-24
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0003] (1) Model-based target tracking: prior information about the tracked target must first be obtained in order to model its structure and motion state. Although this can achieve good results, the tracking quality deteriorates if sufficient information about the target cannot be obtained, and the method also fails to meet real-time requirements.
[0004] (2) Target tracking based on the target contour: because contour information is robustly invariant, the contour of the object is used to represent the moving target and is continuously updated, giving strong resistance to illumination changes; however, tracking can fail when the background is relatively complex.
[0005] (3) Region-based tracking: a target template is established and the target is tracked according to that template. When the moving target is occluded it can no longer be tracked accurately, and tracking is not automatically recovered when the target reappears, causing the tracking algorithm to fail.
(4) Feature-based tracking: this method can maintain good tracking while the target is occluded, but if the target's feature points change, for example through scaling or rotation, the tracking quality is affected.

Method used



Examples


Specific embodiment

[0043] With reference to the attached Figure 1, a specific embodiment of the present invention is as follows:

[0044] 1) Input a video sequence S = {f1, f2, ..., f50}, where fi is the i-th frame of the shot, represented by a two-dimensional matrix of size 50×50. The video shot S is processed by the inter-frame difference method to obtain the contour and centroid coordinates of the moving target. The specific steps are as follows:

[0045] 1.1) Taking f1 and f2 in the video shot S as an example, apply grayscale processing to obtain the grayscale images f1' and f2'. For each pixel j in f1' and f2', calculate D2(j) = f2'(j) − f1'(j), and test D2(j) against the decision equation:

[0046] if D2(j) > T, pixel j is judged to be a foreground point;

[0047] if D2(j) ≤ T, pixel j is judged to be a background point.

[0048] After the moving-target contour D2 is obtained, the coordinates of its center point are stored as the centroid coordinates of the moving target in a Point-type ...
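
The detection step above maps onto common image-processing primitives. Below is a minimal sketch, assuming OpenCV and NumPy are available; the function name detect_centroids and the default threshold value are illustrative assumptions and are not taken from the patent.

```python
import cv2
import numpy as np

def detect_centroids(frame_prev, frame_curr, T=25):
    """Sketch of the inter-frame difference detection step.

    Assumptions: OpenCV/NumPy, BGR input frames, and an illustrative
    threshold T; none of these values come from the patent text.
    """
    # Grayscale processing of two consecutive frames f1, f2
    g1 = cv2.cvtColor(frame_prev, cv2.COLOR_BGR2GRAY).astype(np.int16)
    g2 = cv2.cvtColor(frame_curr, cv2.COLOR_BGR2GRAY).astype(np.int16)

    # D2(j) = f2'(j) - f1'(j); pixels with D2(j) > T are foreground
    d2 = g2 - g1
    foreground = np.where(d2 > T, 255, 0).astype(np.uint8)

    # Moving-target contours and their centroid (center-of-mass) coordinates
    contours, _ = cv2.findContours(foreground, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return contours, centroids
```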



Abstract

The invention discloses a multi-target tracking method based on multi-model fusion and data association. The tracking method comprises the following steps: first, an inter-frame difference method is used to detect the contour and centroid coordinates of each moving target; a pyramid optical flow method is fused with Kalman filtering to predict the centroid coordinates of the moving target at the next moment; the Euclidean distances between the predicted centroid coordinates and the detected centroid coordinates at the next moment form a benefit matrix, and the Hungarian algorithm obtains the optimal matching through data association; finally, trackers that no longer satisfy the requirements are removed and a tracking unit is built for each unassigned detection, thereby realizing multi-target tracking. The method is less affected by illumination changes and background noise, handles tracking failures caused by target occlusion or mutual interference between targets, improves multi-target tracking accuracy, and offers good real-time performance and robustness.
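
As a rough illustration of the association step described in the abstract, the sketch below builds the Euclidean-distance matrix between predicted and detected centroids and solves the assignment with SciPy's Hungarian-algorithm implementation. The gating threshold max_dist and the function name associate are assumptions for illustration and do not come from the patent.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(predicted, detected, max_dist=30.0):
    """Sketch of data association between tracker predictions and detections.

    predicted: (M, 2) centroid positions predicted for the next frame
               (e.g. by pyramid optical flow fused with Kalman filtering).
    detected:  (N, 2) centroid positions from inter-frame differencing.
    Returns matched (tracker, detection) index pairs, unmatched trackers
    (candidates for removal) and unmatched detections (candidates for
    new tracking units).
    """
    predicted = np.asarray(predicted, dtype=float)
    detected = np.asarray(detected, dtype=float)
    if predicted.size == 0 or detected.size == 0:
        return [], list(range(len(predicted))), list(range(len(detected)))

    # Benefit matrix: Euclidean distance between every prediction/detection pair
    cost = np.linalg.norm(predicted[:, None, :] - detected[None, :, :], axis=2)

    # Hungarian algorithm: globally optimal one-to-one matching
    rows, cols = linear_sum_assignment(cost)

    matches, used_r, used_c = [], set(), set()
    for r, c in zip(rows, cols):
        if cost[r, c] <= max_dist:      # reject implausibly distant pairs
            matches.append((r, c))
            used_r.add(r)
            used_c.add(c)

    unmatched_trackers = [r for r in range(len(predicted)) if r not in used_r]
    unmatched_detections = [c for c in range(len(detected)) if c not in used_c]
    return matches, unmatched_trackers, unmatched_detections
```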

Description

Technical field

[0001] The invention belongs to the cross-disciplinary application fields of image processing, video detection and artificial intelligence, and in particular relates to a multi-target tracking method based on multi-model fusion and data association.

Background technique

[0002] Multi-target tracking is a research hotspot and a difficult problem in the field of computer vision, with important application value in intelligent traffic control, intelligent video surveillance and other fields. Owing to the complexity of real environments, problems such as background noise and target occlusion urgently need to be solved. The tracking algorithms currently in use mainly comprise: model-based tracking, target-contour-based tracking, region-based tracking and feature-based tracking. [0003] (1) Model-based target tracking: prior information about the tracked target must first be obtained in order to model its structure and motion state. Although this can achieve...

Claims


Application Information

IPC(8): G06T7/246, G06T5/00
CPC: G06T7/246, G06T2207/10016, G06T2207/20024, G06T5/70
Inventors: 季露 (Ji Lu), 陈志 (Chen Zhi), 岳文静 (Yue Wenjing)
Owner: NANJING UNIV OF POSTS & TELECOMM