
Object Tracking Method and System Based on Triple Convolutional Network and Perceptual Interference Learning

A target tracking and convolutional network technology, applied in the field of target tracking research, that addresses the problems of indistinguishable intra-class interference and inaccurate target tracking, with the effects of enhancing the ability to distinguish intra-class distractors, reducing drift, and improving accuracy.

Active Publication Date: 2021-04-06
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0006] Aiming at the above defects or improvement needs of the prior art, the present invention provides a target tracking method and system based on a triple convolutional network and perceptual interference learning, thereby solving the technical problems in the prior art that intra-class interference is difficult to distinguish and target tracking is inaccurate.



Embodiment Construction

[0055] In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely intended to illustrate the invention and are not intended to limit it. Furthermore, the technical features of the various embodiments described below can be combined with each other as long as they do not conflict.

[0056] As shown in Figure 1, in the target tracking method based on a triple convolutional network and perceptual interference learning, adding a first-frame branch and incorporating perceptual interference learning makes the network more robust and improves tracking accuracy. The method includes the following steps:

[0057] Step 1: Prepare the target tracking dataset: the dataset is the VID dataset. ...
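Although the remaining steps are truncated here, the abstract below indicates that training uses positive and negative sample pairs drawn from the dataset. A minimal Python sketch of how such pairs might be assembled from a VID-style dataset follows; the `make_pairs` helper, the frame-gap limit, the 50/50 sampling ratio, and the per-object data layout are all assumptions for illustration, not details taken from the patent.

```python
# Hedged sketch: building positive / negative sample pairs from a VID-style
# dataset of per-object crop sequences. All names and constants are assumed.
import random

def make_pairs(sequences, max_gap=50, num_pairs=10000):
    """sequences: list of dicts such as
    {"class": "dog", "frames": [crop_0, crop_1, ...]}, one tracked object each."""
    pairs = []
    for _ in range(num_pairs):
        if random.random() < 0.5:
            # Positive pair: two crops of the same object, at most max_gap frames apart.
            seq = random.choice(sequences)
            i = random.randrange(len(seq["frames"]))
            j = min(len(seq["frames"]) - 1, i + random.randint(1, max_gap))
            pairs.append((seq["frames"][i], seq["frames"][j], 1))
        else:
            # Negative pair: crops of two different objects. Pairing objects of the
            # same class would supply the intra-class distractors that the
            # perceptual interference learning is meant to handle (an assumption).
            a, b = random.sample(sequences, 2)
            pairs.append((random.choice(a["frames"]), random.choice(b["frames"]), 0))
    return pairs
```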



Abstract

The invention discloses a target tracking method and system based on a triple convolutional network and perceptual interference learning, belonging to the field of target tracking research in image processing and machine vision. The method includes: inputting the video to be tracked into the triple convolutional network to obtain the target tracking result. Training the triple convolutional network includes: constructing the triple convolutional network; obtaining positive sample pairs and negative sample pairs from the dataset to form a training set; and training the triple convolutional network with the training set, where the two images of each sample pair are fed into the template branch and the detection branch, or into the first-frame branch and the detection branch, respectively. The template branch and the first-frame branch each extract an appearance model feature map, and the two appearance model feature maps are each cross-correlated with the feature map of the detection branch to obtain two response maps. The losses of the two response maps are computed separately for backpropagation, yielding a trained triple convolutional network. The method of the invention achieves higher target tracking accuracy.
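As a rough illustration of the training structure described in this abstract (three branches over a shared feature extractor, cross-correlation of two appearance feature maps with the detection feature map to produce two response maps, and two losses backpropagated together), here is a minimal PyTorch sketch. The AlexNet-like backbone, the soft-margin (logistic) loss, the equal weighting of the two losses, and all layer sizes are assumptions; the text shown here does not specify them.

```python
# Minimal PyTorch sketch of a triple convolutional tracker as outlined in the
# abstract. Backbone layout, loss choice, and loss weighting are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TripleConvNet(nn.Module):
    def __init__(self):
        super().__init__()
        # One shared feature extractor serves the template branch, the
        # first-frame branch, and the detection branch.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 96, 11, stride=2), nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2),
            nn.Conv2d(96, 256, 5), nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2),
            nn.Conv2d(256, 256, 3),
        )

    @staticmethod
    def xcorr(z, x):
        # Cross-correlate each appearance feature map z with the corresponding
        # detection feature map x, yielding one single-channel response map.
        return torch.cat(
            [F.conv2d(xi.unsqueeze(0), zi.unsqueeze(0)) for zi, xi in zip(z, x)],
            dim=0)

    def forward(self, template, first_frame, detection):
        z_t = self.backbone(template)     # appearance features, template branch
        z_f = self.backbone(first_frame)  # appearance features, first-frame branch
        x = self.backbone(detection)      # search-region features, detection branch
        # Two response maps: template vs. detection and first frame vs. detection.
        return self.xcorr(z_t, x), self.xcorr(z_f, x)

def train_step(net, optimizer, template, first_frame, detection, label_map):
    # label_map: +1 near the ground-truth target location, -1 elsewhere, with
    # the same spatial size as the response maps (an assumed convention).
    resp_t, resp_f = net(template, first_frame, detection)
    loss = (F.soft_margin_loss(resp_t, label_map)
            + F.soft_margin_loss(resp_f, label_map))
    optimizer.zero_grad()
    loss.backward()   # the two branch losses are backpropagated together
    optimizer.step()
    return loss.item()
```

At test time, per the abstract, only the video to be tracked is fed through the network, and the resulting response maps give the target location in each frame.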

Description

Technical field

[0001] The present invention belongs to the field of target tracking research in image processing and machine vision, and more particularly relates to a target tracking method and system based on a triple convolutional network and perceptual interference learning.

Background technique

[0002] As an important research direction in the field of computer vision, the main task of target tracking is to accurately and reliably predict the location and size of the target in subsequent frames, given only the target's initial position.

[0003] There are many challenging factors facing current target tracking algorithms, primarily divided into internal factors and external factors. Internal factors are changes of the target itself, such as rapid motion, rotation, and deformation. External factors are mainly changes in the external environment, such as the target being partially or completely occluded, or severe illumination changes in the target area.

[0004] Target T...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/20; G06T7/70
CPC: G06T7/20; G06T2207/20081; G06T2207/20084; G06T7/70
Inventor: 韩守东, 夏鑫鑫, 夏晨斐, 黄飘
Owner: HUAZHONG UNIV OF SCI & TECH