
Moving object tracking method based on sample combination and depth detection network

A moving-target tracking technology based on a depth detection network, applied in the field of image processing. It addresses the problems of slow target recognition, target tracking failure, and heavy time consumption, and achieves shorter target detection time, fast target recognition, and avoidance of the large time overhead of earlier methods.

Active Publication Date: 2019-02-22
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

Although this method localizes targets in natural images, its disadvantage is that mapping the 300 proposed regions onto the last layer of the neural network to extract features consumes a great deal of time, so target recognition is slow and cannot meet the requirements of real-time tracking of moving targets.
For the problem of target size change, that method collects samples of different sizes, centered on the image target, to train a feature network; for the problem of tracking failure caused by rapid target motion, it relies on a decision network to predict the target's position.
The disadvantage of that method is that the decision network estimates the target's motion trajectory from the motion information extracted by the feature network and then predicts the target position, so once the prediction for one frame deviates, the deviation accumulates over subsequent frames and target tracking fails. Although the method tracks accurately when the target's scale changes, because only samples of different sizes are collected to train the feature network, the decision network makes wrong judgments when the target deforms severely, which also causes target tracking to fail.




Embodiment Construction

[0034] The present invention will be further described below in conjunction with the accompanying drawings.

[0035] The specific steps of the present invention are further described with reference to Figure 1.

[0036] Step 1: generate a training sample set using the sample-combination data augmentation method.

[0037] Input the first video frame of the color video image sequence containing the moving target to be tracked.

[0038] Pad the first video frame with zero-valued pixels on its upper, lower, left, and right edges simultaneously, enlarging by 5 pixels per step for 100 steps to generate 100 enlarged images; these enlarged images form the small-scale sample set.
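The padding operation described in paragraph [0038] can be sketched as follows; this is a minimal illustration assuming an H×W×3 NumPy image, and the function name and parameters are illustrative rather than taken from the patent:

    import numpy as np

    def build_small_scale_set(first_frame, step=5, num_steps=100):
        # Pad the first frame with zero-valued (black) pixels on all four
        # edges, growing the border by `step` pixels per iteration, so that
        # `num_steps` progressively enlarged images are produced.
        samples = []
        for i in range(1, num_steps + 1):
            pad = step * i
            enlarged = np.pad(first_frame,
                              ((pad, pad), (pad, pad), (0, 0)),
                              mode="constant", constant_values=0)
            samples.append(enlarged)
        return samples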

[0039] In the first video frame, a rectangular box is determined whose center is the initial position of the moving target to be tracked and whose length and width equal those of the moving target, and the image insid...
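The rectangle described in paragraph [0039] can be cut out as in the sketch below, assuming a NumPy image indexed as (row, column) and a hypothetical helper name; clipping to the image bounds is an added safeguard not stated in the patent text:

    def crop_target_region(frame, center_xy, target_w, target_h):
        # Rectangle centred on the target's initial position, with the
        # target's width and height, clipped to the frame borders.
        cx, cy = center_xy
        x1 = max(int(cx - target_w / 2), 0)
        y1 = max(int(cy - target_h / 2), 0)
        x2 = min(int(cx + target_w / 2), frame.shape[1])
        y2 = min(int(cy + target_h / 2), frame.shape[0])
        return frame[y1:y2, x1:x2]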



Abstract

The invention discloses a moving target tracking method based on sample combination and a depth detection network. The steps of the invention are: (1) generating a training sample set with the sample-combination data augmentation method; (2) setting normalized labels for the training sample set; (3) constructing a depth detection network; (4) training the depth detection network with the training sample set; (5) inputting the color video image sequence containing the target to be tracked into the trained depth detection network frame by frame and outputting the tracking coordinates of the moving target. By generating the training sample set with the sample-combination data augmentation method, training the depth detection network, and determining the position of the target to be tracked from the confidence values of the candidate boxes, the invention overcomes the problems of slow target recognition and inaccurate tracking when the target's appearance deforms.
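As a rough illustration of step (5), the sketch below assumes a hypothetical detector callable that returns candidate boxes and their confidence values for each frame; selecting the highest-confidence box as the tracked position follows the abstract, but the interface itself is invented for illustration:

    def track(video_frames, detector):
        # Run the trained depth detection network frame by frame and report
        # the centre of the highest-confidence candidate box as the target
        # tracking coordinates.
        trajectory = []
        for frame in video_frames:
            boxes, scores = detector(frame)
            best_box = max(zip(boxes, scores), key=lambda bs: bs[1])[0]
            x1, y1, x2, y2 = best_box
            trajectory.append(((x1 + x2) / 2.0, (y1 + y2) / 2.0))
        return trajectory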

Description

Technical field

[0001] The invention belongs to the technical field of image processing, and further relates to a moving target tracking method based on sample combination and a depth detection network within the technical field of moving target tracking. The invention can be used for target tracking in videos with severe deformation, camera shake, scale change, illumination change, and the like.

Background technique

[0002] The main task of target tracking is to detect the target in the input video frames in real time and then determine its position in real time. As understanding of computer vision has deepened, object tracking has been widely applied and developed in this field, and a large number of tracking algorithms already exist for moving object tracking. However, since video tracking learns the target's features only from the first frame, the lack of sample features leads t...


Application Information

IPC(8): G06T7/246; G06T7/73; G06K9/62; G06N3/04
CPC: G06T7/246; G06T7/73; G06T2207/20081; G06T2207/10016; G06N3/045; G06F18/24; G06F18/214
Inventor: 田小林, 李芳, 荀亮, 李帅, 焦李成
Owner: XIDIAN UNIV