
Target tracking method based on local feature learning

A target tracking method based on local feature learning, applied in the field of computer vision, which addresses the weak expressive power of the features extracted by existing tracking methods and the poor adaptability of their model update mechanisms.

Active Publication Date: 2016-06-15
SOUTH CHINA AGRI UNIV

AI Technical Summary

Problems solved by technology

[0005] To overcome the deficiencies of the prior art, namely that the features extracted by existing target tracking methods have weak expressive power and that their model update mechanisms adapt poorly, the present invention proposes a tracking method based on local feature learning. First, the target object and the background are decomposed into a large number of local units with scale and shape invariance, i.e., local regions of the image, which serve as training samples for a target/background classification model; deep learning is then used to learn local representations of the target object and the background from these samples.
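As a rough sketch of this decomposition step, the snippet below cuts a frame into overlapping fixed-size local units and labels each as target or background by whether its center falls inside the given bounding box. The function name, patch size, stride, and the center-based labeling rule are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def extract_local_units(image, box, patch_size=16, stride=8):
    """Return (patches, labels) training samples for a target/background
    classifier, sliding a window over the whole frame."""
    x1, y1, w1, h1 = box
    H, W = image.shape[:2]
    patches, labels = [], []
    for y in range(0, H - patch_size + 1, stride):
        for x in range(0, W - patch_size + 1, stride):
            # Label by whether the patch center lies in the target box.
            cx, cy = x + patch_size // 2, y + patch_size // 2
            inside = (x1 <= cx <= x1 + w1) and (y1 <= cy <= y1 + h1)
            patches.append(image[y:y + patch_size, x:x + patch_size])
            labels.append(1 if inside else 0)
    return np.stack(patches), np.array(labels)
```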




Embodiment Construction

[0052] The present invention is further described below in conjunction with the accompanying drawings, but embodiments of the present invention are not limited thereto.

[0053] A schematic diagram of the execution steps of the method of the present invention is shown in Figure 1; the method specifically includes the following steps:

[0054] S1. Given the first frame image I and its corresponding target area, extract all pixels in the target area:

[0055] P = {(x, y) | x_1 ≤ x ≤ x_1 + w_1, y_1 ≤ y ≤ y_1 + h_1}

[0056] where x_1 and y_1 denote the horizontal and vertical coordinates of the target area, w_1 and h_1 denote its width and height, respectively, and P is the set of all pixels in the target area. At the same time, extract all pixels within the background region:

[0057] N = {(x, y) | x < x_1} ∪ {(x, y) | x > x_1 + w_1} ∪ {(x, y) | y < y_1} ∪ {(x, y) | y > y_1 + h_1}

[0058] Here, N represents the non-target area, that is, the set of all pixels in the background region.
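The two pixel sets can be built directly from these definitions. Below is a minimal NumPy sketch assuming integer pixel coordinates and the inclusive bounds given above; the function name and the W/H frame-size parameters are illustrative.

```python
import numpy as np

def target_and_background_pixels(W, H, x1, y1, w1, h1):
    """Split all (x, y) pixel coordinates of a W x H frame into the
    target set P and its complement, the background set N."""
    xs, ys = np.meshgrid(np.arange(W), np.arange(H))
    in_target = (x1 <= xs) & (xs <= x1 + w1) & (y1 <= ys) & (ys <= y1 + h1)
    P = np.column_stack([xs[in_target], ys[in_target]])    # target pixels
    N = np.column_stack([xs[~in_target], ys[~in_target]])  # background pixels
    return P, N
```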



Abstract

The invention discloses a target tracking method based on local feature learning. The target object and the background are decomposed into a large number of local units with scale and shape invariance, which serve as training samples for a target/background classification model, and deep learning is used to learn local representations of the target object and the background from these samples. The confidence that each specific region of an image belongs to the target object is then evaluated, enabling accurate localization of the target. Because local representations learned from a large number of samples discriminate the target well, the method adapts well to target deformation, occlusion, and similar situations. When the appearance model is updated, local regions with high target confidence are extracted as new training samples and the model is retrained on them. By continuously updating the appearance model during tracking and learning the key characteristics of the target object, the method achieves good tracking results even under large appearance changes.
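As a hedged sketch of this update rule, the snippet below recasts it as simple self-training: local regions the current model scores with high target confidence become fresh positive samples, everything else is treated as background, and the model is refit. The sklearn-style classifier interface and the 0.9 threshold are assumptions, not specified by the patent.

```python
import numpy as np

def update_appearance_model(model, patches, conf_threshold=0.9):
    """Refit `model` on pseudo-labels derived from its own confidence."""
    X = patches.reshape(len(patches), -1)       # flatten each local region
    conf = model.predict_proba(X)[:, 1]         # P(target | region)
    pseudo_labels = (conf > conf_threshold).astype(int)
    # Refit only when both classes are present, so the fit is well-posed.
    if 0 < pseudo_labels.sum() < len(pseudo_labels):
        model.fit(X, pseudo_labels)
    return model
```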

Description

Technical Field

[0001] The present invention relates to the field of computer vision, and more specifically to a target tracking method based on local feature learning.

Background Art

[0002] As an important research direction in computer vision, target tracking has received extensive attention. The technology has broad application prospects in security monitoring, autonomous driving, and military defense. Although a considerable number of target tracking methods already exist, they are often unstable or even fail under illumination changes, target deformation, and severe occlusion. An effective target tracking algorithm therefore has important application value and practical significance.

[0003] At present, many target tracking algorithms have been put into use; a complete target tracking algorithm can be divided into four main parts: the feature extraction method, the establis...


Application Information

IPC(8): G06K9/62, G06T7/20
CPC: G06F18/24
Inventors: 王美华, 梁云, 麦嘉铭
Owner: SOUTH CHINA AGRI UNIV