
Target tracking method based on deep learning and discriminant model training and memory

A target tracking and model-training technology applied in the fields of computer vision and pattern recognition. It addresses problems such as the bottleneck that hinders accuracy improvements in target tracking algorithms, and achieves effects including alleviating the imbalance between positive and negative samples, efficient offline network training, and efficient solving.

Publication date: 2020-10-23 (status: inactive)
INST OF AUTOMATION CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

This essential difference in feature requirements seriously hinders accuracy improvements for target tracking algorithms based on online discriminative model training, which currently use deep convolutional neural networks pre-trained on object classification tasks for feature extraction.




Embodiment Construction

[0051] Preferred embodiments of the present invention are described below with reference to the accompanying drawings. Those skilled in the art should understand that these embodiments are only used to explain the technical principle of the present invention, and are not intended to limit the protection scope of the present invention.

[0052] It should be noted that, in the description of the present invention, the terms "first" and "second" are used only for convenience of description; they neither indicate nor imply the relative importance of the devices, elements or parameters, and should not be understood as limiting the invention.

[0053] Figure 1 is a schematic diagram of the main steps of an embodiment of the target tracking method based on deep learning and discriminant model training of the present invention. As shown in Figure 1, the target tracking method of this embodiment includes: an offline training phase an...



Abstract

The invention relates to the field of computer vision and pattern recognition, and in particular to a target tracking method based on deep learning and discriminant model training, and a memory; it aims to improve the positioning precision of target tracking. The tracking method comprises the following steps. In an offline training stage: extract sample-frame features from a training image and a test image using a deep feature extraction network, and compute the sample-frame labels of the training image and the first labels of the test image; train a discriminant model with a discriminant model solver from the sample-frame features and labels of the training image; use the discriminant model to predict second labels from the sample-frame features of the test image; and compute the network prediction loss from the second and first labels so as to drive optimization of the deep feature extraction network. In an online tracking stage: use the trained deep feature extraction network within a target tracking algorithm based on online discriminant model training. The method effectively improves the positioning precision of target tracking.
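The offline training stage summarized above can be sketched as follows. This is a minimal illustration, not the patented implementation: the deep feature extraction network is replaced by synthetic features, and ridge regression is assumed as the discriminant model solver (the abstract does not specify the solver); all variable names are hypothetical.

```python
import numpy as np

def solve_discriminant(train_feats, train_labels, lam=1e-3):
    """Train a discriminant model from training-image sample-frame
    features and labels. Assumed solver: closed-form ridge regression,
    w = (X^T X + lam*I)^{-1} X^T y."""
    X, y = train_feats, train_labels
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def prediction_loss(test_feats, first_labels, w):
    """Predict second labels on the test image with the discriminant
    model and compare them with the first labels (squared error). In
    the full method, this loss would be backpropagated to optimize the
    deep feature extraction network."""
    second_labels = test_feats @ w
    return float(np.mean((second_labels - first_labels) ** 2))

# Synthetic stand-in for features produced by the (omitted) deep
# feature extraction network.
rng = np.random.default_rng(0)
w_true = rng.normal(size=4)
train_feats = rng.normal(size=(32, 4))
train_labels = train_feats @ w_true     # sample-frame labels, training image
test_feats = rng.normal(size=(16, 4))
first_labels = test_feats @ w_true      # first labels, test image

w = solve_discriminant(train_feats, train_labels)
loss = prediction_loss(test_feats, first_labels, w)
```

Because the test-image first labels come from the same underlying model as the training labels, the prediction loss is near zero here; with imperfect features it would be larger, and its gradient would drive the feature network's learning.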

Description

Technical Field

[0001] The invention relates to the fields of computer vision and pattern recognition, and in particular to a target tracking method and memory based on deep learning and discriminant model training.

Background

[0002] Visual object tracking is one of the hot research topics in computer vision, and an important research direction in the application of pattern recognition based on computer vision. Given the state (position and size) of the target object of interest in the first frame of a video sequence, a target tracking algorithm must estimate the state of the target object throughout the entire sequence.

[0003] At present, target tracking algorithms based on online discriminant model training have achieved a good balance of accuracy and speed on multiple public databases, and have therefore attracted extensive attention from researchers. Most existing target tracking algorithms of this type use deep convolutional neural networks (s...
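The tracking problem stated in [0002] — propagate a first-frame state through the sequence by repeatedly scoring candidates with a discriminative model — can be sketched in a deliberately simplified 1-D form. This is a hedged illustration only: real trackers use a 4-D state (x, y, width, height), deep features, and online model updates, none of which appear here; the function and variable names are hypothetical.

```python
import numpy as np

def track(frames_feats, candidate_offsets, w, init_state=0.0):
    """Minimal online tracking loop. For each frame, score every
    candidate's feature vector with the discriminant model w and move
    the state by the offset of the highest-scoring candidate."""
    state = init_state
    trajectory = [state]
    for feats in frames_feats:           # feats: (n_candidates, d)
        best = int(np.argmax(feats @ w))
        state = state + candidate_offsets[best]
        trajectory.append(state)
    return trajectory

# Toy example: two candidates per frame (move left / move right); the
# discriminant model w scores the second candidate higher every frame.
w = np.array([0.0, 1.0])
frames_feats = [np.array([[1.0, 0.0], [0.0, 1.0]])] * 3
offsets = [-1.0, 1.0]
trajectory = track(frames_feats, offsets, w)  # [0.0, 1.0, 2.0, 3.0]
```

The patent's contribution concerns how the feature extractor feeding such a loop is trained offline, so that the discriminant model separates target from background more precisely.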

Claims


Application Information

IPC: G06T7/246
CPC: G06T7/246; G06T2207/10016; G06T2207/20081; G06T2207/20084
Inventors: 陈盈盈, 郑林宇, 王金桥, 卢汉清
Owner INST OF AUTOMATION CHINESE ACAD OF SCI