
Twin network target tracking method based on different measurement criteria

A twin-network target tracking technology, applied in biological neural network models, image data processing, image enhancement and related areas. It addresses the problems of tracking failure, interference from similar targets, and lack of robustness to changes in target appearance, and provides robustness against tracking drift and target appearance changes.

Active Publication Date: 2021-06-18
XIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a twin network target tracking method based on different measurement criteria, which solves the problem that current twin network target tracking methods are susceptible to interference from similar targets or are not robust to changes in target appearance, resulting in tracking failures.

Method used



Examples


Embodiment 1

[0102] Step 1, select the AlexNet network pre-trained on the ImageNet dataset as the feature extraction network of the twin network.

[0103] Table 1 Feature extraction network parameter table


[0106] The parameters of the feature extraction network are shown in Table 1. The network consists of 5 convolutional layers and 2 pooling layers: each of the first two convolutional layers is followed by a max pooling layer, and a random deactivation (dropout) layer and a ReLU nonlinear activation function are added after each of the first 4 convolutional layers.
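
Table 1 is not reproduced in this extract, so the exact kernel sizes, strides and channel counts of the backbone are not available here. The following PyTorch sketch is an assumption that follows the standard AlexNet convention while matching the layer layout described above (5 convolutional layers, a max pooling layer after each of the first two, and dropout plus ReLU after each of the first four); the class name FeatureExtractor and the dropout rate are hypothetical.

import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    # Assumed AlexNet-style backbone: 5 conv layers, 2 max pooling layers,
    # dropout (random deactivation) + ReLU after the first 4 conv layers.
    def __init__(self, dropout_p: float = 0.5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 96, kernel_size=11, stride=2),    # conv1
            nn.ReLU(inplace=True),
            nn.Dropout2d(dropout_p),
            nn.MaxPool2d(kernel_size=3, stride=2),          # pool1
            nn.Conv2d(96, 256, kernel_size=5),              # conv2
            nn.ReLU(inplace=True),
            nn.Dropout2d(dropout_p),
            nn.MaxPool2d(kernel_size=3, stride=2),          # pool2
            nn.Conv2d(256, 384, kernel_size=3),             # conv3
            nn.ReLU(inplace=True),
            nn.Dropout2d(dropout_p),
            nn.Conv2d(384, 384, kernel_size=3),             # conv4
            nn.ReLU(inplace=True),
            nn.Dropout2d(dropout_p),
            nn.Conv2d(384, 256, kernel_size=3),             # conv5: no activation or dropout
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.features(x)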

[0107] Step 2, obtain the tracking video, and manually select the area where the target is located on the first frame of the video. Let (x, y) be the coordinates of the center point of the target in the first frame, and let m and n be the width and height of the target area, respectively. Taking the center point (x, y) of the target in the first frame as the center, crop a square area with side length z_sz. The calculation formula of z_sz is a...
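
The formula for z_sz is truncated in this extract, so the crop below uses the SiamFC-style context margin p = (m + n) / 4 and z_sz = sqrt((m + 2p)(n + 2p)) purely as an assumption; the helper crop_template and the border clamping are likewise illustrative, not the patent's exact procedure.

import numpy as np

def crop_template(frame: np.ndarray, x: float, y: float, m: float, n: float) -> np.ndarray:
    # Crop a square region of side z_sz centred on the target centre (x, y).
    p = (m + n) / 4.0                                        # assumed context margin
    z_sz = int(round(np.sqrt((m + 2 * p) * (n + 2 * p))))    # assumed formula
    half = z_sz // 2
    h, w = frame.shape[:2]
    # Clamp the crop to the image borders; padding with the mean image colour
    # is a common alternative when the region extends outside the frame.
    x0, x1 = max(0, int(x) - half), min(w, int(x) + half)
    y0, y1 = max(0, int(y) - half), min(h, int(y) + half)
    return frame[y0:y1, x0:x1]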



Abstract

The invention discloses a twin network target tracking method based on different measurement criteria. The method comprises the following specific steps: 1, selecting a feature extraction network; 2, acquiring a tracking video, manually selecting the area where the target is located on the first frame of the video, and obtaining the depth features of the template; 3, entering a subsequent frame, and obtaining the depth features of the current frame's search area by using the coordinate position and the width and height of the tracked target in the previous frame; 4, measuring the similarity between the template depth features and the depth features of the current frame search area with cosine similarity to obtain a response map; 5, measuring the similarity between the template depth features and the depth features of the current frame search area with the Euclidean distance to obtain a second response map; and 6, performing weighted fusion on the two response maps, and determining the position of the target from the maximum value on the fused response map. The method solves the problems that target tracking methods based on a twin network are prone to interference from similar objects and are not robust to changes in target appearance.
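
As an illustration of steps 4 to 6, the sketch below computes one response map with cosine similarity and one with negated Euclidean distance over all template-sized windows of the search feature, then fuses them with a weight and reads the target position from the maximum. The sliding-window comparison, the fusion weight alpha, and the absence of any score normalisation are assumptions; the patent's exact measurement and fusion details are not reproduced here.

import torch
import torch.nn.functional as F

def fused_response(template: torch.Tensor, search: torch.Tensor, alpha: float = 0.5):
    # template: (C, Hz, Wz) depth feature of the template
    # search:   (C, Hx, Wx) depth feature of the current search region, Hx >= Hz, Wx >= Wz
    C, Hz, Wz = template.shape
    # Unfold the search feature into all template-sized windows: (N, C*Hz*Wz)
    windows = F.unfold(search.unsqueeze(0), kernel_size=(Hz, Wz)).squeeze(0).t()
    t = template.reshape(1, -1)

    cos_map = F.cosine_similarity(windows, t, dim=1)   # larger = more similar
    euc_map = -torch.cdist(windows, t).squeeze(1)      # negated so larger = more similar

    # Weighted fusion of the two response maps (alpha is a free parameter here)
    fused = alpha * cos_map + (1.0 - alpha) * euc_map
    Ho, Wo = search.shape[1] - Hz + 1, search.shape[2] - Wz + 1
    fused = fused.reshape(Ho, Wo)
    peak = int(torch.argmax(fused))
    return fused, (peak // Wo, peak % Wo)               # response map and peak (row, col)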

Description

Technical field

[0001] The invention belongs to the technical field of video single target tracking, and relates to a twin network target tracking method based on different measurement criteria.

Background technique

[0002] In the field of computer vision, object tracking has always been an important topic and research direction. The task of target tracking is to estimate the position, shape or occupied area of the tracked target in a continuous video image sequence, and to determine the target's movement speed, direction, trajectory and other motion information. Target tracking has important research significance and broad application prospects. It is mainly used in video surveillance, human-computer interaction, intelligent transportation, and autonomous navigation.

[0003] The target tracking method based on the Siamese network is the mainstream of current target tracking methods. The main idea of the Siamese network structure is to find a function that can m...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/246; G06K9/00; G06K9/62; G06N3/04
CPC: G06T7/246; G06T2207/10016; G06V20/46; G06N3/045; G06F18/22
Inventors: 刘龙, 付志豪, 史思琦
Owner: XIAN UNIV OF TECH