
Cross-modal fusion target tracking method

A cross-modal target tracking technology in the field of computer information, addressing the problem that the differences between cross-modal targets are so large that purely feature-based matching struggles to produce good results.

Active Publication Date: 2021-08-27
SICHUAN UNIV

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to solve the problem that the difference between cross-modal targets is too large for purely feature-based matching to obtain good results.

Method used



Examples


Embodiment Construction

[0035] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it; that is, the described embodiments are only some, not all, of the embodiments of the present invention. The components of the embodiments generally described and illustrated in the figures herein may be arranged and designed in a variety of different configurations.

[0036] Accordingly, the following detailed description of the embodiments of the invention provided in the accompanying drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the invention. Based on the embodiments of the...


Abstract

The invention relates to the technical field of computer information and provides a cross-modal fusion target tracking method. Its objective is to solve the problem that the gap between cross-modal targets is too large for purely feature-based matching to achieve good results. In the main scheme, the method constructs a generative adversarial network composed of a pixel alignment module, a feature alignment module, and a joint discrimination module, and trains it on a data set. Targets to be recognized are extracted from videos collected by different cameras and fed into the trained joint discrimination module, yielding the feature similarity between a target and each candidate. Separately, a logistic regression model is trained on a data set of labeled inter-camera transfer times to predict the time similarity between targets, and this model is used to compute the time similarity of each target pair. The feature similarity and the time similarity are then summed into a total similarity, and the target pair with the highest total similarity is taken to be the same target.
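A minimal sketch of the final fusion step described above, assuming cosine similarity stands in for the joint discrimination module's feature similarity and a single-feature logistic model maps inter-camera transfer time to a time similarity. The function names and the weights `w`, `b` are illustrative assumptions, not values from the patent:

```python
import math

def feature_similarity(f1, f2):
    # Cosine similarity between two feature vectors
    # (a stand-in for the joint discrimination module's output).
    dot = sum(a * b for a, b in zip(f1, f2))
    n1 = math.sqrt(sum(a * a for a in f1))
    n2 = math.sqrt(sum(b * b for b in f2))
    return dot / (n1 * n2)

def time_similarity(transfer_time, w, b):
    # Logistic model mapping inter-camera transfer time to [0, 1];
    # a negative weight w makes long transfer times less plausible.
    return 1.0 / (1.0 + math.exp(-(w * transfer_time + b)))

def best_match(query_feat, query_time, candidates, w=-0.1, b=5.0):
    # Total similarity = feature similarity + time similarity;
    # the candidate with the highest total is declared the same target.
    scored = []
    for cid, (feat, t) in candidates.items():
        total = (feature_similarity(query_feat, feat)
                 + time_similarity(abs(t - query_time), w, b))
        scored.append((total, cid))
    return max(scored)[1]
```

For example, a candidate whose features match and whose transfer time is short dominates both similarity terms and is selected as the same target.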

Description

technical field

[0001] The invention relates to the field of computer information technology and provides a cross-modal fusion target tracking method.

Background technique

[0002] An RGB image has three channels carrying visible-light color information, while an IR image has a single channel carrying invisible-light information. Because IR images contain no color, even humans find it difficult to recognize people across the two modalities using color cues. To address this problem, existing cross-modal re-ID methods mainly focus on bridging the gap between RGB and IR images through feature alignment, as shown in figure 2. The basic idea is to match real RGB and IR images through feature representation learning. However, owing to the large difference between the two modalities, it is difficult to match RGB and IR images directly in a shared feature space.

[0003] Different from existing methods that directly match RGB and IR images, a heuristic approach is to generate a pseudo-IR image b...
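To make the modality gap concrete: an RGB pixel carries three channels while an IR pixel carries one, so any pixel-level bridging of the two modalities must realize a 3-to-1 mapping. The patent's pixel alignment module learns such a mapping adversarially; the hand-crafted luminance reduction below (standard BT.601 weights) is only an illustrative stand-in for that learned mapping, not the patented method:

```python
def rgb_to_pseudo_ir(pixels):
    """Collapse (R, G, B) triples into a single channel using standard
    luminance weights. This is a fixed, hand-crafted stand-in; the
    patent's pixel alignment module instead learns the RGB-to-pseudo-IR
    mapping with a generative adversarial network."""
    return [0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in pixels]
```

A fixed reduction like this discards color structure uniformly, which is exactly why a learned, adversarially trained mapping is preferred for aligning the modalities.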

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/048; G06N3/045; G06F18/22; G06F18/241
Inventor: 左劼 (Zuo Jie), 杨勇 (Yang Yong), 郭际香 (Guo Jixiang), 魏骁勇 (Wei Xiaoyong)
Owner: SICHUAN UNIV