
Moving target tracking method based on sample combination and deep detection network

A technology combining a deep detection network with moving target tracking, applied in the field of image processing. It addresses the problems of slow target recognition, target tracking failure, and heavy time consumption, and achieves the effects of shortening target detection time, increasing target recognition speed, and avoiding a large amount of time consumption.

Active Publication Date: 2021-09-03
XIDIAN UNIV
Cites: 7 · Cited by: 0

AI Technical Summary

Problems solved by technology

Although this method achieves target localization in natural images, its disadvantage is that the 300 proposed regions are each mapped to the last layer of the neural network to extract features, which consumes a great deal of time; as a result, the target recognition speed of the method is slow and cannot meet the requirements of real-time tracking of moving targets.
For the problem of target size change, the method collects samples of different sizes, centered on the image target, to train the feature network; for the problem of tracking failure caused by rapid target motion, the method uses a decision network to estimate the target's motion trajectory from the motion information extracted by the feature network and then predicts the target position. The disadvantage of this approach is that once the prediction for one frame deviates, the deviation accumulates over the subsequent frames and target tracking fails. Although the method can track accurately when the target's scale changes, a further disadvantage is that, because only samples of different sizes are collected to train the feature network, the decision network makes wrong judgments when the target undergoes severe deformation, which also causes tracking to fail.




Embodiment Construction

[0034] The present invention will be further described below with reference to the accompanying drawings.

[0035] With reference to the attached Figure 1, the specific steps of the present invention are further described.

[0036] Step 1: generate a training sample set using the sample-combination data augmentation method.

[0037] Input the first video frame of the color video image sequence containing the moving target to be tracked.

[0038] Pad the four edges of the first video frame with zero-value pixels, 5 pixels per step, repeated 100 times, to generate 100 enlarged images; these enlarged images form a small-scale sample set.
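A minimal sketch of this padding step, assuming the frame is a NumPy array in height x width x channel layout and that the 5-pixel padding accumulates from one enlarged image to the next; the function name and the use of numpy.pad are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def build_small_scale_sample_set(first_frame, step=5, num_samples=100):
    """Pad the four edges of the first frame with zero-value pixels,
    adding `step` more pixels per edge for each of `num_samples` images
    (read here as cumulative: 5, 10, ..., 500 pixels per edge)."""
    samples = []
    for i in range(1, num_samples + 1):
        pad = step * i
        enlarged = np.pad(first_frame,
                          ((pad, pad), (pad, pad), (0, 0)),  # top/bottom, left/right, channels untouched
                          mode="constant", constant_values=0)
        samples.append(enlarged)
    return samples
```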

[0039] In the first video frame, a rectangular box is determined whose center is the center of the initial position of the moving target to be tracked and whose length and width equal those of the moving target; the image inside this rectangular box is taken as the initial target image.
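The cropping of the initial target image can be sketched as below, assuming the initial position is given as a center point plus the target's width and height in pixels; the function name and the clipping to the frame boundary are illustrative assumptions.

```python
import numpy as np

def crop_initial_target(first_frame, center_x, center_y, target_w, target_h):
    """Return the image inside a rectangle centered on the target's initial
    position, with the target's own width and height (clipped to the frame)."""
    frame_h, frame_w = first_frame.shape[:2]
    x1 = max(0, center_x - target_w // 2)
    y1 = max(0, center_y - target_h // 2)
    x2 = min(frame_w, x1 + target_w)
    y2 = min(frame_h, y1 + target_h)
    return first_frame[y1:y2, x1:x2].copy()
```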

[...



Abstract

The invention discloses a moving target tracking method based on sample combination and a deep detection network. The implementation steps of the invention are: (1) generating a training sample set using the sample-combination data augmentation method; (2) normalizing the training sample set; (3) constructing a deep detection network; (4) training the deep detection network with the training sample set; (5) inputting the color video image sequence containing the target to be tracked into the trained deep detection network frame by frame and outputting the tracking coordinates of the moving target. The invention generates the training sample set with the sample-combination data augmentation method, trains the deep detection network, and uses the confidence values of the candidate boxes to determine the position of the target to be tracked, thereby solving the problems of slow target recognition and inaccurate tracking when the target's appearance deforms.
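Step (5) of the abstract, choosing the tracked position from candidate-box confidences, can be read as the following sketch; the detection network and the normalization are passed in as callables because their internals are not specified here, and all names are illustrative assumptions rather than the patent's own.

```python
import numpy as np

def track_with_trained_detector(frames, detector, normalize):
    """For each frame, run the trained deep detection network and keep the
    candidate box with the highest confidence as the tracked target position.
    `detector` is assumed to map a normalized frame to (boxes, confidences)."""
    coordinates = []
    for frame in frames:
        boxes, confidences = detector(normalize(frame))
        coordinates.append(boxes[int(np.argmax(confidences))])
    return coordinates
```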

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and further relates to a moving target tracking method based on sample combination and a deep detection network within the field of moving target tracking. The invention can be used for target tracking in videos with severe deformation, camera shake, scale change, illumination change, and the like.

Background Art

[0002] The main task of target tracking is to detect the target in the input video frames in real time and thereby locate the target in real time. With the continually deepening understanding of computer vision, target tracking has been widely applied and developed in this field. At present, a large number of tracking algorithms achieve moving target tracking. However, because video tracking learns the target's features only from the first frame, the lack of sample features will lead ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/246, G06T7/73, G06K9/62, G06N3/04
CPC: G06T7/246, G06T7/73, G06T2207/20081, G06T2207/10016, G06N3/045, G06F18/24, G06F18/214
Inventors: 田小林, 李芳, 荀亮, 李帅, 焦李成
Owner: XIDIAN UNIV