In this paper we redefine and generalize the classic k-nearest neighbors (k-NN) voting rule in a Bayesian maximum-a-posteriori (MAP) framework. In this setting, annotated examples are u...
Paolo Piro, Richard Nock, Frank Nielsen, Michel Ba...
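A minimal sketch of this MAP reading of k-NN voting, assuming the usual estimate of the class posterior as the fraction of the k neighbors carrying each label (the function name, data layout, and Euclidean distance are illustrative choices, not taken from the paper):

```python
from collections import Counter
import math

def knn_map_posterior(train, query, k):
    """MAP k-NN voting: approximate P(c | x) by the fraction of the
    k nearest neighbors of x labeled c, then predict the argmax class.
    `train` is a list of ((features...), label) pairs."""
    neighbors = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    counts = Counter(label for _, label in neighbors)
    posterior = {c: n / k for c, n in counts.items()}
    return max(posterior, key=posterior.get), posterior
```

With uniform class priors this argmax over posterior fractions coincides with the classic majority vote; the Bayesian framing is what the paper generalizes.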
For supervised and unsupervised learning, positive definite kernels make it possible to use large and potentially infinite-dimensional feature spaces with a computational cost that only depe...
Sequential forward selection (SFS) is one of the most widely used feature selection procedures. It starts with an empty set and adds one feature at each step. The estimate of the ...
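The greedy loop described above can be sketched as follows; the function name and the toy score function are our own illustrative choices, and in practice the score would be a cross-validated model evaluation:

```python
def sequential_forward_selection(features, score, k):
    """Greedy SFS: start from the empty set and, at each step, add the
    single remaining feature that maximizes the score of the selected
    subset, stopping after k features have been chosen."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        best = max(remaining, key=lambda f: score(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected
```

Because each step conditions only on the current subset, SFS cannot undo an earlier choice, which is exactly the nesting effect whose estimation properties the abstract discusses.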
The objective of this paper is twofold. The first part provides further insight into the statistical properties of the Welch power spectrum estimator. A major drawback of the Welch m...
We present a framework for tracking rigid objects based on an adaptive Bayesian recognition technique that incorporates dependencies between object features. At each frame we fin...