
Space-time correlation target re-identification method and system

A spatiotemporal-correlation re-identification technology, applied in the field of target re-identification, addressing shortcomings of existing methods: the real-time motion state of the target is not considered, the use of spatial information stops at drawing GIS maps and movement trajectories, and the timing of the target's movement trajectory is not synchronized across cameras.

Active Publication Date: 2018-11-06
SHANGHAI JIAO TONG UNIV

AI Technical Summary

Problems solved by technology

[0004] A search of existing technologies shows that although current target re-identification technology is widely used in relay tracking, many target re-identification modules use only the visual feature information of the target in the image.
For example, patent No. CN201210201004.7, "Intelligent Visual Sensing Network Moving Target Relay Tracking System Based on GPS and GIS", combines GPS and GIS information, i.e. spatial information, for screening. However, its use of temporal and spatial information stops at drawing GIS maps and target movement trajectories; it offers no technique that directly uses time and space information to improve the accuracy of target re-identification. Its re-identification module is still based only on visual features, so visual similarity may still produce a large number of low-probability candidate targets at unreasonable time intervals.
[0005] In addition, there are patents that combine target time and location information, for example application No. CN201610221993.4, a target re-identification method based on space-time constraints. That method assigns a minimum travel time to each pair of adjacent cameras and, from a Weibull distribution and the measured appearance time of a candidate target, gives the probability of the target appearing at that moment; this time probability is then combined with the visual matching features into a joint probability.
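For concreteness, the kind of joint score that patent describes can be sketched roughly as below. This is a minimal illustration only: the function names, the Weibull shape and scale values, and the way similarity and time probability are multiplied are assumptions, not details taken from CN201610221993.4.

```python
import math

def weibull_pdf(t, shape, scale):
    """Weibull density; zero for non-positive t."""
    if t <= 0:
        return 0.0
    z = t / scale
    return (shape / scale) * z ** (shape - 1) * math.exp(-(z ** shape))

def prior_art_joint_score(visual_similarity, observed_delay, min_travel_time,
                          shape=1.5, scale=20.0):
    """Joint score = visual similarity x probability of the observed delay,
    where the delay beyond a fixed per-camera-pair minimum travel time is
    modelled by a Weibull distribution (shape/scale are illustrative values)."""
    time_prob = weibull_pdf(observed_delay - min_travel_time, shape, scale)
    return visual_similarity * time_prob
```

Note that every candidate crossing the same camera pair receives the identical time term regardless of how fast it is actually moving, which is exactly the limitation discussed next.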
However, that method does not consider the real-time motion state of the target: the probability of the target appearing at a given moment depends entirely on the measured time, which raises two main problems. First, the times used in that patent are the local clocks of the individual cameras rather than unified global positioning timing, so unsynchronized camera clocks lead to unsynchronized timestamps and directly distort the whole prediction. Second, and more importantly, the patent considers only common factors such as time span and spatial distance, not individual factors such as the target's displacement direction and speed. It directly specifies a shortest travel time between adjacent cameras; the information carried by that value is essentially no different from the path distance between adjacent cameras in GIS data, and amounts to nothing more than a spatial-distance constraint. In practice, different targets differ individually in speed and other characteristics: some move fast, some slowly. To estimate accurately when a target will appear in a camera's field of view, the target's real-time motion information must be combined to predict the possible arrival times of different targets.
For example, suppose two targets with similar visual characteristics are in the field of view of camera A and both move toward camera B, one being the tracked target and the other not. Under the algorithm of that patent, the probability distributions of their entry times into camera B are identical Weibull distributions. But if one of the two targets moves fast and the other slowly, their arrival times at B will differ considerably, the conclusion produced by application No. CN201610221993.4 no longer holds, and the selection of candidate targets becomes imprecise. A rough illustration of how a per-target speed estimate changes the predicted arrival time follows.
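The sketch below makes the fast/slow example concrete. The numbers and the pixel-to-metre calibration factor are hypothetical and only illustrate the argument; they are not taken from the present invention's embodiments.

```python
def expected_arrival_time(path_distance_m, pixel_speed, metres_per_pixel):
    """Convert the target's measured pixel motion rate in camera A into a
    rough ground-plane speed and predict how long it needs to reach camera B."""
    ground_speed = max(pixel_speed * metres_per_pixel, 1e-6)  # m/s
    return path_distance_m / ground_speed

# Two visually similar targets leave camera A for camera B, 60 m away.
fast = expected_arrival_time(60.0, pixel_speed=120.0, metres_per_pixel=0.025)  # ~20 s
slow = expected_arrival_time(60.0, pixel_speed=40.0, metres_per_pixel=0.025)   # ~60 s
```

A fixed Weibull distribution per camera pair assigns these two targets the same arrival-time probability, whereas the speed-aware estimate separates them by a factor of three.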
[0006] Further searching found no target re-identification method that combines visual features, space-time constraints, and the individual motion information of the target, and that uses globally unified timing for time synchronization.

Method used




Embodiment Construction

[0041] The present invention is described in detail below in conjunction with specific embodiments. The following examples will help those skilled in the art to further understand the present invention, but do not limit it in any form. It should be noted that those skilled in the art may make various modifications and improvements without departing from the concept of the present invention; these all fall within the protection scope of the present invention.

[0042] The present invention uses the pixel motion rate of the target in the video data to estimate, for each segment of video data, the probability distribution of the time the target needs to cross between two adjacent cameras a fixed distance apart. Based on this duration probability, candidate targets appearing in the video are first screened and pre-processed: candidates whose appearance falls outside the reasonable crossing time interval are filtered out, reducing the probability that visually similar targets are mistakenly matched as the tracked target. A rough sketch of such a pre-screening step is given below.
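This is a minimal sketch assuming globally synchronized timestamps on all observations. The window width (the `slack` factor), the pixel-to-metre calibration, and the field names are hypothetical illustrations, not the patented embodiment.

```python
def plausible_window(path_distance_m, pixel_speed, metres_per_pixel, slack=0.5):
    """Rough arrival-time window [t_min, t_max] for one tracked target,
    derived from its current pixel motion rate; 'slack' widens the window
    to absorb speed changes along the path."""
    ground_speed = max(pixel_speed * metres_per_pixel, 1e-6)  # m/s
    expected = path_distance_m / ground_speed                 # seconds
    return (1.0 - slack) * expected, (1.0 + slack) * expected

def prescreen_candidates(candidates, t_exit, path_distance_m,
                         pixel_speed, metres_per_pixel):
    """Keep only candidates whose globally timed appearance in the adjacent
    camera falls inside the plausible crossing interval; the survivors are
    then ranked by visual features as usual."""
    t_min, t_max = plausible_window(path_distance_m, pixel_speed, metres_per_pixel)
    return [c for c in candidates if t_min <= (c["t_appear"] - t_exit) <= t_max]
```

The filtered list is what the visual-feature matcher sees, so candidates observed at implausible times never compete with the true target on appearance alone.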

[0043] Specifically, in the embodiment...



Abstract

The invention relates to a space-time correlation target re-identification method. The method uses the pixel motion rate of a target in video data to estimate, for each segment of video data, the probability distribution of the time the target needs to cross between two adjacent cameras a fixed distance apart. Based on this duration probability, candidate targets appearing in the video can first be screened and pre-processed: candidates that exceed the reasonable crossing time interval are filtered out, reducing the probability that similar targets are mistakenly matched as the tracked target. The invention further relates to a space-time correlation target re-identification system. The advantage of the method is that the generated matching result is constrained by the space-time position and the target motion information; compared with the original unconstrained matching that relies only on visual features, re-identification accuracy can be effectively improved.

Description

Technical field
[0001] The present invention relates to target re-identification technology, and specifically to a space-time correlated target re-identification method and a corresponding target re-identification system.
Background technique
[0002] The target re-identification problem is to use computer vision technology to judge whether a specific target is present in an image or video sequence. Specifically, when video is used to track a specific target, since each video source comes from a fixed position, cross-video relay tracking must be performed when the target leaves the field of view; detecting the specific target in other video sources is then a target re-identification problem.
[0003] The target re-identification problem uses the visual features of the target acquired from the image to perform feature matching and give possible candidate targets. Due to feature similarity between different targets, it is also possible that the candid...


Application Information

IPC (8): G06K 9/00; G06N 3/08
CPC: G06N 3/08; G06V 20/10
Inventor: 张重阳, 孔熙雨, 归琳
Owner: SHANGHAI JIAO TONG UNIV