Convolutional neural network-based target tracking method and apparatus

A convolutional neural network-based target tracking technology, applicable to image data processing, instruments, computing, etc. It addresses the problems that different individuals cannot be distinguished, that robustness is low, and that tracking performance is limited, so as to overcome the inability to distinguish different individuals within the same class of object, improve computational efficiency, and avoid overfitting.

Active Publication Date: 2017-06-13
明见(厦门)技术有限公司

AI Technical Summary

Problems solved by technology

However, a problem with current tracking algorithms is that low-level, hand-crafted features cannot capture the high-level semantic information of the target. Moreover, the texture and motion patterns of targets differ between video sequences, and such features are not robust to occlusion and similar situations, which limits tracking performance.
In recent years, convolutional neural networks (CNNs) have achieved great success in image classification. By applying this classification capability to the tracking problem and training a deep convolutional neural network, it becomes possible to learn high-level semantic features of the target object and thus remedy the shortcomings of hand-crafted features. However, a general convolutional neural network cannot distinguish well between different individuals of the same class of object, which easily causes tracking errors.




Embodiment Construction

[0028] To further illustrate the various embodiments, the present invention provides accompanying drawings. These drawings form part of the disclosure of the present invention and serve mainly to illustrate the embodiments; taken together with the related descriptions in the specification, they explain the operating principles of the embodiments. With reference to these contents, those skilled in the art will understand other possible implementations and advantages of the present invention. The present invention is further described below in conjunction with the accompanying drawings and specific embodiments.

[0029] Although the environment varies, for example through changes in lighting, motion blur, or changes in scale, the texture features of an object still show some commonality, while differences remain between different objects. Therefore, the convolutional neural network designed in an embodiment of the present invention includes two parts: the feature sharing layers and the specific classification layers.
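To make the two-part structure concrete, the following is a minimal sketch, assuming a PyTorch-style implementation: shared convolutional layers learn features common to all targets, while one small classification branch per training video sequence separates that sequence's target from the background. The layer sizes, kernel sizes, and class count are illustrative assumptions, not the patent's exact architecture.

# Minimal sketch (assumed PyTorch-style), not the patent's exact network.
import torch
import torch.nn as nn

class MultiBranchTracker(nn.Module):
    def __init__(self, num_sequences: int):
        super().__init__()
        # Feature sharing layers: learn texture features common to all targets.
        self.shared = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=5), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        # One specific classification layer per video sequence: each branch
        # separates its own target (foreground) from the background.
        self.branches = nn.ModuleList(
            [nn.Linear(128, 2) for _ in range(num_sequences)]
        )

    def forward(self, x: torch.Tensor, seq_idx: int) -> torch.Tensor:
        return self.branches[seq_idx](self.shared(x))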



Abstract

The invention provides a convolutional neural network-based target tracking method and apparatus. The method comprises the steps of: constructing a convolutional neural network, wherein the convolutional neural network comprises a plurality of feature sharing layers and a specific classification layer; performing offline training, wherein for different video sequences with target labels, training is performed using different specific classification layers together with the feature sharing layers, and the network parameters are updated until the network converges or a preset number of iterations is reached, the network parameters including the parameters of the specific classification layers and the parameters of the feature sharing layers; and updating the convolutional neural network, obtaining an optimal region from a video frame by using the convolutional neural network, and setting the optimal region as the tracking region. According to the method and apparatus, the deficiency of hand-crafted features is overcome, the shortcoming of general convolutional neural networks on the tracking problem is remedied, and the robustness of the target tracking algorithm is improved.
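As a hedged illustration of the offline training step described above, the sketch below cycles over labelled video sequences, passes sampled target/background patches through the shared layers plus that sequence's own classification branch, and updates both sets of parameters until the loss change falls below a threshold or a preset number of iterations is reached. The sample_batch callback, optimizer settings, and convergence test are assumptions for illustration, not details taken from the patent.

# Sketch of offline training over multiple labelled video sequences.
import torch
import torch.nn as nn

def offline_train(model, sample_batch, num_sequences, max_iters=10000, tol=1e-4, lr=1e-3):
    # sample_batch(seq_idx) is a hypothetical helper returning (patches, labels)
    # for one labelled video sequence: patches are image crops, labels are
    # 1 for the target and 0 for the background.
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    criterion = nn.CrossEntropyLoss()
    prev_loss = float("inf")
    for it in range(max_iters):
        seq_idx = it % num_sequences              # cycle through the video sequences
        patches, labels = sample_batch(seq_idx)
        optimizer.zero_grad()
        logits = model(patches, seq_idx)          # shared layers + this sequence's branch
        loss = criterion(logits, labels)
        loss.backward()                           # gradients for shared and branch parameters
        optimizer.step()
        if abs(prev_loss - loss.item()) < tol:    # crude convergence check
            break
        prev_loss = loss.item()
    return model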

Description

Technical Field

[0001] The present invention relates to the field of computer vision, and in particular to a method and apparatus for target tracking based on a convolutional neural network.

Background Art

[0002] The high incidence of traffic accidents causes serious personal and property safety problems in today's society, which makes the research and application of advanced driver assistance systems (ADAS) increasingly widespread, and the anti-collision system occupies a very important position among them. Accurate tracking of dangerous objects (such as automobiles, bicycles, pedestrians, etc.) is a prerequisite for driving safety.

[0003] Target tracking is a fundamental problem in computer vision. Most current target tracking algorithms use hand-crafted features: they build a feature model for the target, find the region within the search range that best matches the target feature model, and then update the model, which can be transformed into a binary classification...
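The tracking-by-classification pipeline sketched in the background (generate candidate regions in a search range, score them against the target model, keep the best match) can be illustrated as follows. This is a hedged sketch rather than the patent's own procedure: the Gaussian candidate sampling, the crop_fn helper, and the candidate count are assumptions, and the trained network from the earlier sketches is reused as the scoring model.

# Sketch of scoring candidate regions in one frame and keeping the best one.
import torch

def track_frame(model, frame, prev_box, crop_fn, seq_idx=0, num_candidates=256):
    # prev_box = (cx, cy, w, h): previous tracking region.
    # crop_fn(frame, box) is a hypothetical helper that crops and resizes the
    # region given by box into the 3 x H x W tensor expected by the model.
    cx, cy, w, h = prev_box
    # Perturb the previous box to generate candidate regions in the search range.
    noise = torch.randn(num_candidates, 4) * torch.tensor([0.1 * w, 0.1 * h, 0.05 * w, 0.05 * h])
    candidates = torch.tensor([cx, cy, w, h]) + noise
    patches = torch.stack([crop_fn(frame, box) for box in candidates])
    with torch.no_grad():
        scores = model(patches, seq_idx).softmax(dim=1)[:, 1]  # foreground probability
    best = int(scores.argmax())
    return tuple(candidates[best].tolist())  # optimal region becomes the new tracking region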


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/246, G06T7/277
Inventors: 谢超, 陈从华, 陈海沯, 叶德焰, 任赋, 林雅
Owner: 明见(厦门)技术有限公司