
ICASSP 2008, IEEE

Bringing diverse classifiers to common grounds: dtransform

Several classification scenarios employ multiple independently trained classifiers whose outputs must be combined. However, since each trained classifier exhibits different statistical characteristics, it is not appropriate to combine them with techniques that are blind to these differences. We propose a transform, dtransform, that maps classifier outputs to approximate posterior probabilities while catering to the statistical behavior of each classifier. The transformed outputs are then comparable and can be combined using any of the classical combination rules. Results demonstrate that the proposed transform provides better estimates of the posterior probabilities than standard transformations, as reflected in lower KL distance from the true distribution, higher classification accuracy, and greater effectiveness of the standard classifier combination rules.
Devi Parikh, Tsuhan Chen
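
The abstract does not give the form of dtransform itself, so the following Python sketch only illustrates the general pipeline it describes: each classifier's raw scores are mapped to approximate posterior probabilities, and the calibrated outputs are then fused with a classical combination rule. A Platt-style logistic calibration is used purely as a stand-in for the per-classifier transform (it is not the authors' dtransform), and all names, data, and parameters below are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

def _sigmoid(z):
    # Numerically stable logistic function.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def platt_calibrate(scores, labels):
    """Fit a, b so that sigmoid(a*score + b) approximates P(y=1 | score).
    A stand-in per-classifier transform, NOT the paper's dtransform."""
    def nll(params):
        a, b = params
        p = _sigmoid(a * scores + b)
        eps = 1e-12
        return -np.mean(labels * np.log(p + eps) + (1 - labels) * np.log(1 - p + eps))
    a, b = minimize(nll, x0=[1.0, 0.0], method="Nelder-Mead").x
    return lambda s: _sigmoid(a * s + b)

def sum_rule(posteriors):
    """Classical sum (average) combination rule over calibrated outputs."""
    return np.mean(posteriors, axis=0)

# Two hypothetical classifiers whose raw scores live on very different scales,
# so fusing them without calibration would let the second classifier dominate.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=500)
scores_a = 2.0 * labels - 1.0 + rng.normal(0.0, 1.0, size=500)
scores_b = 10.0 * labels + rng.normal(0.0, 5.0, size=500)

# Map each classifier's scores to approximate posteriors, then combine them.
cal_a = platt_calibrate(scores_a, labels)
cal_b = platt_calibrate(scores_b, labels)
fused = sum_rule(np.stack([cal_a(scores_a), cal_b(scores_b)]))
print("fused posteriors (first 5 samples):", np.round(fused[:5], 3))

Once both classifiers' outputs are on the common scale of posterior probabilities, any of the classical rules (sum, product, max) can be applied directly, which is the setting the paper evaluates.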
Type: Conference
Year: 2008
Where: ICASSP
Authors: Devi Parikh, Tsuhan Chen