ICPR 2010 (IEEE)
Boosting Bayesian MAP Classification

In this paper we redefine and generalize the classic k-nearest-neighbor (k-NN) voting rule in a Bayesian maximum-a-posteriori (MAP) framework. To this end, annotated examples are used to estimate pointwise class probabilities in the feature space, giving rise to a new instance-based classification rule. Specifically, we propose to "boost" the classic k-NN rule by inducing a strong classifier from a combination of sparse training data, called "prototypes". To learn these prototypes, our MAPBOOST algorithm globally minimizes a multiclass exponential risk defined over the training data, which depends on the class probabilities estimated at the sample points themselves. We tested our method for image categorization on three benchmark databases. Experimental results show that MAPBOOST significantly outperforms the classic k-NN rule (by up to 8%). Interestingly, due to the supervised selection of sparse prototypes and the multiclass classification framework, the accuracy improv...
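The abstract's starting point, reinterpreting k-NN voting as a MAP decision over pointwise class-probability estimates, can be illustrated with a minimal sketch. This is not the authors' MAPBOOST (which additionally learns sparse prototypes by minimizing a multiclass exponential risk); it only shows the baseline rule: estimate P(class | query) from the fraction of each class among the k nearest annotated examples, then predict the arg-max (MAP) class. The function name `knn_map_predict` and the toy data are illustrative assumptions.

```python
# Illustrative sketch (not the authors' MAPBOOST): the classic k-NN rule
# viewed as a MAP decision over pointwise class-probability estimates.
from collections import Counter
import math

def knn_map_predict(train, query, k=3):
    """Estimate P(class | query) from the k nearest annotated examples
    and return the MAP class (arg max of the estimated posterior)."""
    # Euclidean distance from the query to every annotated example.
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    # Pointwise posterior estimate: fraction of the k neighbors per class.
    posterior = {c: n / k for c, n in votes.items()}
    return max(posterior, key=posterior.get), posterior

# Toy two-class data in a 2-D feature space.
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
pred, post = knn_map_predict(train, (0.2, 0.1), k=3)
# pred == "a"; estimated posterior: {"a": 2/3, "b": 1/3}
```

MAPBOOST, as described above, replaces the raw training set with a learned sparse set of prototypes so that the same MAP decision becomes a boosted strong classifier; the sketch makes clear what quantity (the pointwise class posterior) that learning procedure acts on.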
Added 13 Feb 2011
Updated 13 Feb 2011
Type Journal
Year 2010
Where ICPR
Authors Paolo Piro, Richard Nock, Frank Nielsen, Michel Barlaud