
Probabilistic boosting tree framework for learning discriminative models

Inactive Publication Date: 2007-03-08
SIEMENS MEDICAL SOLUTIONS USA INC
Cites: 2 · Cited by: 43

AI Technical Summary

Problems solved by technology

The task of classifying/recognizing, detecting, and clustering general objects in natural scenes is extremely challenging.
Let x be an image sample and y its interpretation. Ideally, a generative model p(x|y) is obtained for a pattern so that the statistics of any sample x can be measured. Unfortunately, not only are such generative models often out of reach, but they also impose a large computational burden at inference time.
For example, no existing generative model captures all the variations of a face, such as multiple views, shadow, expression, occlusion, and hair style.
The current AdaBoost method also has several problems.
First, though it asymptotically converges to the target distribution, it may need to pick hundreds of weak classifiers.
This poses a huge computational burden.
Second, the order in which features are picked in the training stage is not preserved.
Third, the re-weighting scheme of AdaBoost may cause samples that were previously classified correctly to be misclassified again.
Fourth, though extensions from two-class to multi-class classification have been proposed, learning weak classifiers in the multi-class case using output coding is more difficult and computationally expensive.
The AdaTree method learns a strong classifier by combining a set of weak classifiers into a tree structure, but it does not address multi-class classification.
Moreover, pushing all positives to the right side may cause a high false-positive rate, especially when the positives and negatives are hard to separate.
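The re-weighting behavior mentioned above can be seen in a minimal sketch of discrete AdaBoost with 1-D threshold stumps. This is an illustration of the standard algorithm, not code from the patent; all function and variable names are our own.

```python
import math

def stump_predict(x, thresh, sign):
    # weak classifier: a signed threshold test on a scalar feature
    return sign if x >= thresh else -sign

def train_adaboost(xs, ys, rounds=5):
    """Discrete AdaBoost over threshold stumps; ys in {-1, +1}."""
    n = len(xs)
    w = [1.0 / n] * n                       # uniform sample weights
    ensemble = []                           # list of (alpha, thresh, sign)
    for _ in range(rounds):
        # pick the stump with minimum weighted error
        best = None
        for thresh in xs:
            for sign in (1, -1):
                err = sum(wi for wi, xi, yi in zip(w, xs, ys)
                          if stump_predict(xi, thresh, sign) != yi)
                if best is None or err < best[0]:
                    best = (err, thresh, sign)
        err, thresh, sign = best
        err = min(max(err, 1e-10), 1 - 1e-10)           # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, thresh, sign))
        # re-weighting step: mistakes gain weight, correct samples lose it,
        # so samples "solved" earlier can later be misclassified again
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, thresh, sign))
             for wi, xi, yi in zip(w, xs, ys)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def strong_classify(ensemble, x):
    # weighted vote of the weak classifiers
    score = sum(a * stump_predict(x, t, s) for a, t, s in ensemble)
    return 1 if score >= 0 else -1
```

On hard-to-separate data, the weight updates in the loop above concentrate mass on the boundary samples, which is exactly the failure mode the text describes.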




Embodiment Construction

[0031] The present invention is directed to a probabilistic boosting tree framework for computing two-class and multi-class discriminative models. In the learning stage, the probabilistic boosting tree (PBT) automatically constructs a tree in which each node combines a number of weak classifiers (e.g., evidence, knowledge) into a strong classifier or conditional posterior probability. The PBT approaches the target posterior distribution by data augmentation (e.g., tree expansion) through a divide-and-conquer strategy.

[0032] In the testing stage, the conditional probability is computed at each tree node based on the learned classifier, which guides the probability propagation in its sub-trees. The top node of the tree therefore outputs the overall posterior probability by integrating the probabilities gathered from its sub-trees. Clustering is also naturally embedded in the learning phase, and each sub-tree represents a cluster at a certain level.
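The propagation in [0032] can be sketched as a recursion that weights each sub-tree's posterior by the node classifier's soft decision. The node layout, the logistic mapping from score to probability, and all names below are our assumptions, not the patent's notation.

```python
import math

class Node:
    def __init__(self, score_fn=None, left=None, right=None, empirical=None):
        self.score_fn = score_fn    # strong classifier: x -> real-valued score
        self.left = left            # sub-tree receiving the q(+1|x) mass
        self.right = right          # sub-tree receiving the q(-1|x) mass
        self.empirical = empirical  # leaf only: stored empirical p(y=+1)

def posterior(node, x):
    """Return p(y=+1 | x): leaves report their empirical distribution,
    internal nodes mix sub-tree posteriors by the soft decision q(+1|x)."""
    if node.empirical is not None:            # leaf node
        return node.empirical
    q_pos = 1.0 / (1.0 + math.exp(-2.0 * node.score_fn(x)))
    return (q_pos * posterior(node.left, x)
            + (1.0 - q_pos) * posterior(node.right, x))
```

For instance, a depth-1 tree whose root scores by the raw feature value sends a strongly positive sample almost entirely into the left leaf, so the root's output approaches that leaf's empirical probability.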

[0033] In the training stage, a tree is recursively constructed in which each tree node is a strong classifier. The input training set is divided into two new sets, left and right ones, according to the learned classifier. Each set is then used to train the left and right sub-trees recursively.
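The recursive construction in [0033] can be sketched as follows. This is a minimal 1-D illustration under our own assumptions: the stopping thresholds (`max_depth`, `eps`) are arbitrary, and `train_strong_classifier` is a trivial stand-in for the boosting routine (e.g., AdaBoost) the patent would use at each node.

```python
def train_strong_classifier(samples, labels):
    # stand-in for boosting: threshold at the midpoint of the class means
    pos = [x for x, y in zip(samples, labels) if y == 1]
    neg = [x for x, y in zip(samples, labels) if y != 1]
    t = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2.0
    return lambda x: 1.0 if x >= t else 0.0

def train_pbt(samples, labels, depth=0, max_depth=3, eps=0.1):
    """Recursively build a tree; each node holds a strong classifier that
    splits the training set into left/right subsets for its sub-trees."""
    pos = sum(1 for y in labels if y == 1)
    purity = pos / len(labels)
    # stop when the node is nearly pure or the tree is deep enough
    if depth >= max_depth or purity <= eps or purity >= 1.0 - eps:
        return {"leaf": True, "p_pos": purity}
    clf = train_strong_classifier(samples, labels)     # boosting at this node
    left_idx = [i for i, x in enumerate(samples) if clf(x) >= 0.5]
    right_idx = [i for i, x in enumerate(samples) if clf(x) < 0.5]
    if not left_idx or not right_idx:                  # split failed: leaf
        return {"leaf": True, "p_pos": purity}
    return {
        "leaf": False,
        "clf": clf,
        "left": train_pbt([samples[i] for i in left_idx],
                          [labels[i] for i in left_idx],
                          depth + 1, max_depth, eps),
        "right": train_pbt([samples[i] for i in right_idx],
                           [labels[i] for i in right_idx],
                           depth + 1, max_depth, eps),
    }
```

The divide-and-conquer structure is visible in the recursion: each sub-tree only ever sees the samples its parent classifier routed to it, so sub-trees specialize on progressively harder, more homogeneous subsets.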



Abstract

A probabilistic boosting tree framework for computing two-class and multi-class discriminative models is disclosed. In the learning stage, the probabilistic boosting tree (PBT) automatically constructs a tree in which each node combines a number of weak classifiers (e.g., evidence, knowledge) into a strong classifier or conditional posterior probability. The PBT approaches the target posterior distribution by data augmentation (e.g., tree expansion) through a divide-and-conquer strategy. In the testing stage, the conditional probability is computed at each tree node based on the learned classifier which guides the probability propagation in its sub-trees. The top node of the tree therefore outputs the overall posterior probability by integrating the probabilities gathered from its sub-trees. In the training stage, a tree is recursively constructed in which each tree node is a strong classifier. The input training set is divided into two new sets, left and right ones, according to the learned classifier. Each set is then used to train the left and right sub-trees recursively.

Description

CROSS REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of U.S. Provisional Application Ser. No. 60/660,136, filed on Mar. 9, 2005, which is incorporated by reference in its entirety.
FIELD OF THE INVENTION
[0002] The present invention is directed to a probabilistic boosting tree framework for learning discriminative models, and more particularly, to a probabilistic boosting tree framework for computing two-class and multi-class discriminative models.
BACKGROUND OF THE INVENTION
[0003] The task of classifying/recognizing, detecting and clustering general objects in natural scenes is extremely challenging. The difficulty is due to many reasons: large intra-class variation and inter-class similarity, articulation and motion, different lighting conditions, orientations/viewing directions, and the complex configurations of different objects. FIG. 1 shows a multitude of different images. The first row 102 of FIG. 1 displays some face images. The rest of the rows ...

Claims


Application Information

IPC (8): G06K9/00; G06N20/00; G06V10/774
CPC: G06K9/6256; G06K2209/05; G06N7/005; G06N99/005; G06T7/0044; G06T2207/30201; G06T2207/10072; G06T2207/10132; G06T2207/20132; G06T2207/30044; G06T2207/30048; G06T7/0048; G06T7/74; G06T7/77; G06N20/00; G06V2201/03; G06V10/774; G06N7/01; G06F18/214
Inventors: TU, ZHUOWEN; BARBU, ADRIAN
Owner: SIEMENS MEDICAL SOLUTIONS USA INC