
System and method for tracking and recognizing people

A system and recognition method, applied in the field of tracking and recognizing people, that addresses problems such as the difficulty of accurately tracking multiple people in real-world scenarios.

Inactive Publication Date: 2013-05-30
GENERAL ELECTRIC CO

AI Technical Summary

Benefits of technology

The patent describes a system that tracks and recognizes people using computer vision. It learns each person's unique appearance through a process called "online discriminative learning": the system collects unlabeled tracking samples from person trackers and uses them to build appearance models for the different people, which it can then use to recognize individuals without any prior knowledge of their identity. The system can also use weighted pairwise constraints and spectral clustering to improve accuracy.
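The pairwise-constraint and spectral-clustering idea mentioned above can be sketched as follows. This is a minimal illustration only, not the patent's implementation: it assumes Gaussian affinities over appearance feature vectors and simple hard constraint weights, and all function names and parameters are made up for this sketch.

```python
import numpy as np

def kmeans(X, k, iters=50):
    # Deterministic farthest-point initialization, then Lloyd iterations
    centers = [X[0]]
    for _ in range(1, k):
        d = ((X[:, None] - np.array(centers)[None]) ** 2).sum(-1).min(1)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        labels = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        for c in range(k):
            if (labels == c).any():
                centers[c] = X[labels == c].mean(0)
    return labels

def spectral_clusters(features, must_link, cannot_link, n_clusters, sigma=1.0):
    """Group unlabeled tracking samples into identity clusters."""
    n = len(features)
    # Gaussian affinity from pairwise appearance distances
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    # Pairwise constraints: boost must-link, suppress cannot-link affinities
    for i, j in must_link:
        W[i, j] = W[j, i] = 1.0
    for i, j in cannot_link:
        W[i, j] = W[j, i] = 0.0
    np.fill_diagonal(W, 0.0)
    # Symmetric normalized graph Laplacian
    d = np.maximum(W.sum(1), 1e-12)
    D = np.diag(1.0 / np.sqrt(d))
    L = np.eye(n) - D @ W @ D
    # Embed samples using eigenvectors of the smallest eigenvalues
    _, vecs = np.linalg.eigh(L)
    emb = vecs[:, :n_clusters]
    emb /= np.linalg.norm(emb, axis=1, keepdims=True) + 1e-12
    return kmeans(emb, n_clusters)
```

Here the constraints are applied as hard edits to the affinity matrix; a weighted scheme, as the summary suggests, would instead scale affinities by a per-constraint confidence.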

Problems solved by technology

However, various environmental challenges (e.g., harsh lighting conditions, cluttered backgrounds, etc.) may cause tracking errors making it difficult to accurately track multiple people in any real-world scenario.




Embodiment Construction

[0013]In the subsequent paragraphs, various aspects of identifying and tracking multiple people will be explained in detail. The various aspects of the present techniques will be explained, by way of example only, with the aid of figures hereinafter. The present techniques for identifying and tracking multiple people will generally be described by reference to an exemplary tracking and recognition system (e.g., trajectory-based tracking and recognition system) designated by numeral 10.

[0014]The tracking and recognition system 10 depicted in FIG. 1 is configured to track people despite tracking errors (e.g., temporary trajectory losses and / or identity switches) that may occur. These tracking errors may result in noisy data or samples that include spatiotemporal gaps. The tracking and recognition system 10 is configured to handle the noisy data to enable the recognition and tracking of multiple people. The tracking and recognition system 10 includes a tracking subsystem 12 and a compu...



Abstract

A tracking and recognition system is provided. The system includes a computer vision-based identity recognition system configured to recognize one or more persons, without a priori knowledge of the respective persons, via an online discriminative learning of appearance signature models of the respective persons. The computer vision-based identity recognition system includes a memory physically encoding one or more routines, which when executed, cause the performance of constructing pairwise constraints between the unlabeled tracking samples. The computer vision-based identity recognition system also includes a processor configured to receive unlabeled tracking samples collected from one or more person trackers and to execute the routines stored in the memory via one or more algorithms to construct the pairwise constraints between the unlabeled tracking samples.
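As a rough illustration of how pairwise constraints might be constructed from unlabeled tracker output, one common approach (sketched here under assumptions not taken from the patent, including the hypothetical `Sample` record and its fields) is: samples from the same tracklet become must-link pairs, while samples from different tracklets observed at the same instant become cannot-link pairs, since one person cannot be in two places at once.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    track_id: int   # id assigned by the person tracker (may switch on errors)
    t: float        # timestamp of the observation

def pairwise_constraints(samples):
    """Return (must_link, cannot_link) lists of index pairs.

    - Same tracklet -> must-link (the tracker followed one person).
    - Different tracklets at the same time -> cannot-link
      (one person cannot appear in two places at once).
    """
    must, cannot = [], []
    for i in range(len(samples)):
        for j in range(i + 1, len(samples)):
            a, b = samples[i], samples[j]
            if a.track_id == b.track_id:
                must.append((i, j))
            elif abs(a.t - b.t) < 1e-6:
                cannot.append((i, j))
    return must, cannot
```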

Description

BACKGROUND

[0001] Smart environments, such as an indoor office and/or living space with ambient intelligence, have been widely adopted in various domains. A prerequisite to taking advantage of the intelligent and context-aware services within these spaces is knowing people's locations and their spatiotemporal context with respect to the environment. Typically, person detectors and video-based person tracking systems with a tracking-by-detection paradigm may be utilized to determine people's locations and their spatiotemporal context within the environment. For example, a multi-camera, multi-person tracking system may be utilized to localize and track individuals in real time. However, various environmental challenges (e.g., harsh lighting conditions, cluttered backgrounds, etc.) may cause tracking errors, making it difficult to accurately track multiple people in any real-world scenario.

BRIEF DESCRIPTION

[0002] In a first embodiment, a tracking and recognition system is provided. The syst...


Application Information

IPC(8): G06K9/00
CPC: G06K9/6232; G06K9/00771; G06T7/70; G06V20/52; G06V10/7715; G06V40/10; G06F18/23
Inventors: YU, TING; TU, PETER HENRY; GAO, DASHAN; AKBAY, KUNTER SEREF; YAO, YI
Owner GENERAL ELECTRIC CO