
Visual multi-target tracking method and device based on deep learning

A multi-target tracking technology based on deep learning, applied in the field of visual multi-target tracking methods and devices, which solves problems such as heavy computation, poor tracking performance, and the inability to track in real time, achieving accurate tracking with low computational complexity.

Pending Publication Date: 2020-05-15
CRRC IND INST CO LTD
9 Cites · 38 Cited by

AI Technical Summary

Problems solved by technology

However, the multi-target tracking methods disclosed in the prior art generally involve a large amount of computation and cannot achieve real-time tracking, resulting in poor tracking performance.




Embodiment Construction

[0021] In order to make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are some, but not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0022] Figure 1 is a flowchart of the deep learning-based visual multi-target tracking method provided by an embodiment of the present invention. As shown in Figure 1, the method includes:

[0023] Step 101: according to the frame sequence of the video, sequentially obtain the candidate detection boxes of the tracking targets in the current video frame through a target detection network model, record their coordinate position information, and obtain the corresponding template images.
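As a rough illustration of this step, the template image for each candidate box can be cropped out of the current frame using its recorded coordinates. The function name and box values below are hypothetical stand-ins; a real target detection network (and its output format) would supply the boxes.

```python
import numpy as np

def crop_template(frame, box):
    """Extract a template image from a frame given a candidate
    detection box (x1, y1, x2, y2). The coordinates are kept
    alongside the crop so the tracker can associate each template
    with its target in later frames."""
    x1, y1, x2, y2 = box
    return frame[y1:y2, x1:x2].copy()

# Toy grayscale "frame" and a hypothetical detector output.
frame = np.arange(100).reshape(10, 10)
box = (2, 3, 5, 6)
template = crop_template(frame, box)
print(template.shape)  # (3, 3)
```

In practice each crop would then be resized to the fixed input size the tracking network expects.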



Abstract

The embodiment of the invention provides a deep learning-based visual multi-target tracking method and device. The method comprises the steps of: sequentially obtaining candidate detection boxes of tracking targets in the current video frame through a target detection network model, recording their coordinate position information, and obtaining a corresponding template image for each target; acquiring each frame of the video except the first frame as a to-be-searched region image; and inputting each template image together with the to-be-searched region image into a target tracking network model constructed from a twin (Siamese) convolutional neural network, thereby obtaining the tracking result for each tracking target. Because the template image of each tracking target and the to-be-searched region image, both acquired with the target detection network model, are input separately into the Siamese tracking network, the tracking result corresponding to each template image is obtained with a low amount of computation, realizing real-time and accurate multi-target tracking.

Description

Technical Field

[0001] The invention relates to the technical field of computer vision, and in particular to a deep learning-based visual multi-target tracking method and device.

Background

[0002] Visual object tracking is an active research topic in the field of computer vision. With the rapid development of computer technology, object tracking has made great progress, and with the rapid rise of artificial intelligence in recent years, research on object tracking has received more and more attention.

[0003] Deep learning has powerful feature representation capabilities and has achieved better results than traditional methods in applications such as image classification, object recognition, and natural language processing, so it has gradually become the mainstream technology for image and video research. Tracking based on deep learning is an important branch of target tracking methods; it takes advantage of the end-...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/246; G06N3/04
CPC: G06T7/251; G06N3/045
Inventors: 田寅, 温博阁, 唐海川, 咸哓雨, 李欣旭
Owner: CRRC IND INST CO LTD