
Template matching tracking method and system based on depth feature fusion

A template matching and depth feature technology, applied in the field of target tracking, which can solve problems such as insufficient robustness, lack of real-time performance, and a tendency to drift, and achieves the effects of improving robustness and suppressing jitter and drift.

Active Publication Date: 2021-01-29
CHINA AUTOMOTIVE INNOVATION CORP

AI Technical Summary

Problems solved by technology

Due to factors such as occlusion, lighting, and target non-rigidity in actual scenes, the accuracy and robustness of target tracking still face problems.
[0003] In the prior art, when a depth feature stream is used to process images, features of static objects are filtered out so that features of moving objects can be established, and the moving-object features of a key frame are propagated to the current frame. This technique is prone to drift during processing, resulting in insufficient robustness.
At the same time, additional storage is required to obtain the depth frame and the feature information database, and the feature information sets of all reference contours must be computed and matched one by one, so real-time requirements cannot be satisfied in practical applications.




Embodiment Construction

[0042] The present invention achieves target tracking through a template matching tracking method based on deep feature fusion and a system that implements the method. The scheme is described in further detail below through embodiments, in conjunction with the accompanying drawings.

[0043] This application proposes a template matching tracking method based on deep feature fusion and a system for implementing the method. Figure 1 is a flowchart of the inventive method, which is divided into the following steps:

[0044] Step 1: Obtain video data and input the first frame of the video into the deep convolutional network. This step also preprocesses the acquired video data, specifically resizing the image to be input so that it matches the input size the deep convolutional network accepts...
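The preprocessing described in Step 1 can be sketched as below. This is a minimal illustration, not the patented implementation: the 224x224 input size, the nearest-neighbour resize, and the [0, 1] scaling are all assumptions for the example, since the patent does not specify the network's input dimensions or normalization.

```python
import numpy as np

def preprocess_frame(frame: np.ndarray, size: tuple = (224, 224)) -> np.ndarray:
    """Resize an HxWx3 uint8 frame to a fixed network input size.

    Uses a simple nearest-neighbour index mapping (an assumption for this
    sketch; any standard resize would do) and scales pixels to [0, 1].
    """
    h, w = frame.shape[:2]
    th, tw = size
    # Map each target row/column back to a source row/column.
    rows = np.arange(th) * h // th
    cols = np.arange(tw) * w // tw
    resized = frame[rows][:, cols]
    return resized.astype(np.float32) / 255.0
```

The resized tensor would then be fed to the backbone network for the feature extraction of Step 2.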


PUM

No PUM

Abstract

The invention provides a template matching tracking method and system based on depth feature fusion. The method comprises the following steps: step 1, obtaining video data and inputting the first frame image of the video into a deep convolutional network; step 2, enabling the deep convolutional network to receive the image frame information and perform feature extraction and output; step 3, obtaining an estimated target position according to the feature information; step 4, judging the type of the target and correcting the target box according to the type information; step 5, extracting the next frame of video data and inputting it into the deep convolutional network for feature extraction; step 6, performing template matching on the feature map extracted in step 5 using the target features of the previous frame; step 7, outputting the target position in the current frame; and step 8, judging whether video reading is finished: ending target tracking when the finishing condition is met, and jumping back to step 5 when it is not. According to the invention, target tracking is realized by detecting the target and judging its position in the video.
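The abstract does not spell out how the template matching of step 6 is performed. One common way to realize matching between the previous frame's target features and the current frame's feature map is normalized cross-correlation over a sliding window; the sketch below illustrates that technique only, using NumPy, and should not be read as the patented method. The function name and the brute-force search are assumptions for the example.

```python
import numpy as np

def match_template(feat_map: np.ndarray, template: np.ndarray) -> tuple:
    """Slide `template` (th x tw x C) over `feat_map` (H x W x C) and return
    the (row, col) of the window with the highest zero-mean normalized
    cross-correlation score."""
    H, W, _ = feat_map.shape
    th, tw, _ = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t) + 1e-8
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(H - th + 1):
        for x in range(W - tw + 1):
            patch = feat_map[y:y + th, x:x + tw]
            p = patch - patch.mean()
            score = float((p * t).sum() / (np.linalg.norm(p) * t_norm + 1e-8))
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos
```

Here the returned position is the top-left corner of the best-scoring window on the feature map; a tracker would map it back to image coordinates to output the target position of step 7.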

Description

technical field
[0001] The invention relates to a template matching tracking method and system based on deep feature fusion, and in particular to the technical field of target tracking.
Background technique
[0002] With the development of computer technology, detection methods based on deep learning have gradually come to dominate the fields of target detection, classification, and segmentation. Due to factors such as occlusion, illumination, and target non-rigidity in actual scenes, the accuracy and robustness of target tracking still face problems. [0003] In the prior art, when a depth feature stream is used to process images, features of static objects are filtered out so that features of moving objects can be established, and the moving-object features of a key frame are propagated to the current frame. This technique is prone to drift during processing, resulting in insufficient robustness. At the same time, obtaining the depth frame and feature information ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/32; G06K9/62; G06N3/04; G06N3/08; G06T7/246
CPC: G06N3/08; G06T7/251; G06T2207/10016; G06T2207/20081; G06T2207/20084; G06V40/103; G06V20/48; G06V20/46; G06V10/25; G06V10/751; G06V2201/07; G06N3/045; G06F18/22; G06F18/253
Inventor: 陈志轩
Owner: CHINA AUTOMOTIVE INNOVATION CORP