
Multi-target tracking method based on lstm network and deep reinforcement learning

A multi-target tracking technology combining an LSTM network with deep reinforcement learning. It addresses the problems of incomplete hand-designed models and inaccurate tracking results in the prior art, overcoming model incompleteness and improving multi-target tracking accuracy and precision.

Active Publication Date: 2020-08-11
HUAIYIN INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0004] Purpose of the invention: to overcome the technical shortcomings of the prior art, in which artificially designed models are not comprehensive enough and tracking results are not accurate enough, the present invention provides a multi-target tracking method based on an LSTM network and deep reinforcement learning.




Embodiment Construction

[0025] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0026] As shown in Figure 1, the multi-target tracking method based on LSTM network and deep reinforcement learning includes the following steps:

[0027] (1) Use the YOLO v2 target detector to detect each frame of the video to be tested and output the detection results; denote the set of detection results for the image at time t as D_t = {d_t^1, d_t^2, ..., d_t^N}, where d_t^j is the j-th detection result for the image at time t and N is the total number of detections;
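Step (1) can be sketched in Python. The detector function, frame shapes, and the dictionary layout of the detection sets below are illustrative assumptions, not the patent's implementation; a random stand-in replaces the actual YOLO v2 network:

```python
import numpy as np

# Hypothetical stand-in for the YOLO v2 detector of step (1): for one frame
# it returns N bounding boxes (x, y, w, h) with confidence scores. The real
# detector, the frame format, and N = 3 are assumptions for illustration.
def detect_frame(frame, num_detections=3):
    rng = np.random.default_rng(0)
    boxes = rng.uniform(0.0, 1.0, size=(num_detections, 4))  # normalised boxes
    scores = rng.uniform(0.5, 1.0, size=num_detections)
    return [{"box": b, "score": s} for b, s in zip(boxes, scores)]

# D[t] plays the role of the detection set D_t for the image at time t,
# with D[t][j] corresponding to the j-th detection result d_t^j.
video = [np.zeros((416, 416, 3)) for _ in range(5)]  # dummy frames
D = {t: detect_frame(frame) for t, frame in enumerate(video)}
```

The per-frame detection sets D[t] are then consumed by the trackers and the data-association module of the later steps.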

[0028] (2) As shown in Figure 2, construct multiple single-target trackers based on deep reinforcement learning. Each single-target tracker consists of a convolutional neural network (CNN) and a fully connected layer (FC). The convolutional neural network is built on the VGG-16 network, a state-of-the-art architecture with wide applications...
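The tracker structure of step (2) can be illustrated with a toy numpy forward pass. The fixed random projection below is a stand-in for the VGG-16 backbone, and the 11-action output head follows the convention of deep-reinforcement-learning trackers (box shift/scale moves plus a stop action); both the feature size and the action set are assumptions, not the patent's specification:

```python
import numpy as np

rng = np.random.default_rng(1)

def cnn_features(patch):
    # Stand-in for the VGG-16 backbone of the patent: flatten the cropped
    # target patch and project it to an assumed 512-d feature vector.
    W = rng.standard_normal((patch.size, 512)) * 0.01
    return patch.reshape(-1) @ W

def fc_action_scores(feat, num_actions=11):
    # Fully connected layer mapping features to a probability distribution
    # over candidate actions (e.g. moves of the target box, plus "stop").
    W = rng.standard_normal((feat.size, num_actions)) * 0.01
    b = np.zeros(num_actions)
    logits = feat @ W + b
    e = np.exp(logits - logits.max())
    return e / e.sum()                      # softmax over actions

patch = rng.standard_normal((32, 32, 3))    # cropped target region
probs = fc_action_scores(cnn_features(patch))
```

In a trained tracker the action with the highest probability would be applied to the target box at each step; here the weights are random, so only the shapes and the CNN-then-FC structure are meaningful.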


Abstract

The invention discloses a multi-target tracking method based on LSTM network and deep reinforcement learning. A target detector is used to detect each frame of the video to be tested, and the detection results are output. Multiple single-target trackers based on deep reinforcement learning are constructed; each single-target tracker includes a convolutional neural network and a fully connected layer, the convolutional neural network being built on the VGG-16 network. The tracking results of each single-target tracker are output to compute the similarity matrix for data association. The data association module is built on an LSTM network: the similarity matrix is input to obtain a distribution probability vector whose j-th element is the matching probability between the i-th target and detection result j, and the detection result with the highest matching probability is taken as the tracking result of the i-th target. The invention is robust to mutual occlusion, similar appearances, and a constantly changing number of targets during multi-target tracking, and improves multi-target tracking accuracy and precision.
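The data-association step can be sketched end to end under stated assumptions: the similarity below is IoU between boxes (the patent does not fix the measure here), and a row-wise softmax stands in for the trained LSTM that maps the similarity matrix to per-target distribution probability vectors:

```python
import numpy as np

def iou(a, b):
    # Intersection-over-union of two boxes given as (x1, y1, x2, y2).
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def associate(tracks, detections):
    # S[i, j] = similarity between target i and detection j.
    S = np.array([[iou(t, d) for d in detections] for t in tracks])
    e = np.exp(S)
    P = e / e.sum(axis=1, keepdims=True)  # stand-in for the LSTM's output
    # Row i of P is the distribution probability vector for target i;
    # the highest-probability detection becomes its tracking result.
    return P, P.argmax(axis=1)

tracks = [(0, 0, 10, 10), (20, 20, 30, 30)]
detections = [(1, 1, 11, 11), (21, 19, 31, 29)]
P, match = associate(tracks, detections)
```

With the boxes above, each target is matched to the detection it overlaps, mirroring how the highest matching probability selects the tracking result for the i-th target.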

Description

Technical Field

[0001] The invention belongs to the field of computer vision and relates to a video multi-target tracking method, in particular to a multi-target tracking method based on an LSTM network and deep reinforcement learning.

Background Technique

[0002] Multi-target tracking is a hot issue in the field of computer vision and plays an important role in many application fields, such as artificial intelligence, virtual reality, and unmanned driving. Despite a large body of earlier related work, multi-target tracking remains challenging due to frequent occlusions, the similar appearance of multiple objects, and the constantly changing number of objects during tracking.

[0003] In recent years, detection-based multi-target tracking methods have achieved some success. They divide multi-target tracking into two parts: multi-target detection and data association. The detection-based multi-target tracking method can solve the problem ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/20; G06N3/04
CPC: G06T7/20; G06T2207/20084; G06T2207/10016; G06N3/045
Inventor: 姜明新, 常波, 贾银洁
Owner: HUAIYIN INSTITUTE OF TECHNOLOGY