
A template matching tracking method and system based on deep feature fusion

A technology of template matching and deep features, applied in the field of target tracking. It addresses problems such as insufficient robustness, lack of real-time performance, and susceptibility to drift, and achieves the effects of improved robustness and suppression of jitter and drift.

Active Publication Date: 2021-04-09
CHINA AUTOMOTIVE INNOVATION CORP


Problems solved by technology

Due to factors such as occlusion, illumination changes, and target non-rigidity in real scenes, the accuracy and robustness of target tracking remain problematic.
[0003] In the prior art, when images are processed with a deep feature stream, static-object features are filtered out so that moving-object features can be built, and the moving-object features of key frames are propagated to the current frame. This approach is prone to drift during processing, resulting in insufficient robustness.
At the same time, additional storage is required to obtain the depth frames and the feature-information database, and the feature-information sets of all reference contours must be computed and matched one by one, so real-time requirements cannot be met in practical applications.



Embodiment Construction

[0042] The present invention achieves target tracking through a template matching tracking method based on deep feature fusion and a system implementing the method. The scheme is described in further detail below through embodiments, in conjunction with the accompanying drawings.

[0043] This application proposes a template matching tracking method based on deep feature fusion and a system implementing the method. As shown in accompanying figure 1, the flowchart of the inventive method, the method is divided into the following steps:

[0044] Step 1: Acquire video data and input the first frame of the video into the deep convolutional network. This step further preprocesses the acquired video data, specifically resizing the image to be input into the deep convolutional network to a size acceptable to the deep convolutional network; the acceptable size of the pro...
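The preprocessing in Step 1 amounts to resizing each frame to the network's fixed input size. A minimal sketch follows, assuming a nearest-neighbour resize and a 224x224 target size; the patent does not specify the size, so both are assumptions for illustration only.

```python
import numpy as np

def preprocess(frame, target_size=(224, 224)):
    # Resize the frame to the fixed input size the network accepts.
    # Nearest-neighbour resize via integer index arrays; the 224x224
    # default is an assumption, not stated in the patent.
    th, tw = target_size
    h, w = frame.shape[:2]
    rows = np.arange(th) * h // th   # source row for each output row
    cols = np.arange(tw) * w // tw   # source column for each output column
    return frame[rows[:, None], cols]

frame = np.zeros((480, 640, 3), dtype=np.uint8)
print(preprocess(frame).shape)   # (224, 224, 3)
```

In practice this role would be filled by a library resize (e.g. bilinear interpolation); the index-array version above just keeps the sketch dependency-free.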



Abstract

The present invention proposes a template matching tracking method and system based on deep feature fusion, realized through the following steps: Step 1, acquire video data and input the first frame of the video into a deep convolutional network; Step 2, the deep convolutional network receives the image-frame information, performs feature extraction, and outputs the features; Step 3, obtain the estimated target position from the feature information; Step 4, judge the target category and correct the target frame according to the category information; Step 5, extract the next frame of video data and input it into the deep convolutional network for feature extraction; Step 6, using the feature map extracted in Step 5, perform template matching with the target features of the previous frame; Step 7, output the target position in the current frame; Step 8, judge whether the video has been read to the end; complete target tracking if the end condition is met, otherwise jump to Step 5. The invention realizes target tracking by detecting the target and judging the position of the target in the video.
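The eight steps above can be sketched as a simple tracking loop. The sketch below is illustrative only: it stands in for the patent's deep convolutional network with a trivial normalization step, implements Step 6's template matching as exhaustive normalized cross-correlation, and omits the category judgment of Step 4; the function names and the synthetic video are assumptions, not from the patent.

```python
import numpy as np

def extract_features(frame):
    # Stand-in for the deep convolutional network of Steps 1-2 and 5:
    # here the "feature map" is simply the zero-mean, unit-variance frame.
    f = frame.astype(np.float64)
    return (f - f.mean()) / (f.std() + 1e-8)

def match_template(feature_map, template):
    # Step 6: slide the previous-frame target template over the current
    # feature map; return the top-left corner with the highest
    # normalized cross-correlation score.
    th, tw = template.shape
    H, W = feature_map.shape
    t = template - template.mean()
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(H - th + 1):
        for x in range(W - tw + 1):
            patch = feature_map[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum() * (t * t).sum()) + 1e-8
            score = (p * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos

def track(frames, init_box):
    # init_box = (y, x, h, w) of the target in the first frame; in the
    # patent this comes from the detector of Steps 3-4, here it is given.
    y, x, h, w = init_box
    template = extract_features(frames[0])[y:y + h, x:x + w]
    positions = [(y, x)]
    for frame in frames[1:]:                   # Steps 5-8: loop over frames
        fmap = extract_features(frame)
        y, x = match_template(fmap, template)  # Step 6: template matching
        positions.append((y, x))               # Step 7: output position
        template = fmap[y:y + h, x:x + w]      # update template for next frame
    return positions

# Synthetic video: a random 8x8 patch moving right by 2 px per frame.
rng = np.random.default_rng(0)
patch = rng.random((8, 8))
frames = []
for i in range(3):
    img = np.zeros((32, 32))
    img[10:18, 4 + 2 * i: 12 + 2 * i] = patch
    frames.append(img)

print(track(frames, (10, 4, 8, 8)))   # [(10, 4), (10, 6), (10, 8)]
```

Normalized cross-correlation is invariant to the affine change introduced by the per-frame normalization, which is why the exact target location scores highest; a real implementation would use convolutional feature maps and a vectorized matcher rather than the explicit double loop.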

Description

Technical field

[0001] The invention relates to a template matching tracking method and system based on deep feature fusion, and in particular to the technical field of target tracking.

Background technique

[0002] With the development of computer technology, deep-learning-based detection methods have gradually come to dominate the fields of target detection, classification, and segmentation. However, due to factors such as occlusion, illumination changes, and target non-rigidity in real scenes, the accuracy and robustness of target tracking remain problematic.

[0003] In the prior art, when images are processed with a deep feature stream, static-object features are filtered out so that moving-object features can be built, and the moving-object features of key frames are propagated to the current frame. This approach is prone to drift during processing, resulting in insufficient robustness. At the same time, obtaining the depth frame and feature information ...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/00; G06K9/32; G06K9/62; G06N3/04; G06N3/08; G06T7/246
CPC: G06N3/08; G06T7/251; G06T2207/10016; G06T2207/20081; G06T2207/20084; G06V40/103; G06V20/48; G06V20/46; G06V10/25; G06V10/751; G06V2201/07; G06N3/045; G06F18/22; G06F18/253
Inventor: 陈志轩
Owner: CHINA AUTOMOTIVE INNOVATION CORP