
Optimizing Classifier Performance via an Approximation to the Wilcoxon-Mann-Whitney Statistic

When the goal is to achieve the best correct classification rate, cross entropy and mean squared error are typical cost functions used to optimize classifier performance. However, for many real-world classification problems, the ROC curve is a more meaningful performance measure. We demonstrate that minimizing cross entropy or mean squared error does not necessarily maximize the area under the ROC curve (AUC). We then consider alternative objective functions for training a classifier to maximize the AUC directly. We propose an objective function that is an approximation to the Wilcoxon-Mann-Whitney statistic, which is equivalent to the AUC. The proposed objective function is differentiable, so gradient-based methods can be used to train the classifier. We apply the new objective function to real-world customer behavior prediction problems for a wireless service provider and a cable service provider, and achieve reliable improvements in the ROC curve.
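The core idea is to replace the non-differentiable step function inside the Wilcoxon-Mann-Whitney (WMW) statistic with a smooth surrogate, so that a classifier can be trained by gradient methods to target AUC directly. The sketch below illustrates that idea on a linear scorer using a sigmoid surrogate; the sigmoid form, the toy Gaussian data, and all names are illustrative assumptions, and the exact approximation proposed in the paper may take a different functional form.

import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: two Gaussian classes (purely illustrative; the paper
# evaluates on real customer-behavior data from service providers).
X_pos = rng.normal(loc=1.0, size=(200, 5))   # positive-class feature vectors
X_neg = rng.normal(loc=0.0, size=(800, 5))   # negative-class feature vectors

def wmw_statistic(s_pos, s_neg):
    # Fraction of (positive, negative) score pairs that are correctly
    # ordered; numerically equal to the area under the ROC curve (AUC).
    return np.mean((s_pos[:, None] - s_neg[None, :]) > 0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def surrogate_loss_and_grad(w, X_pos, X_neg, beta=2.0):
    # Smooth surrogate for 1 - AUC: the step function in the WMW statistic
    # is replaced by a sigmoid of the pairwise score difference, making the
    # objective differentiable in the linear weights w.
    s_pos, s_neg = X_pos @ w, X_neg @ w
    d = s_pos[:, None] - s_neg[None, :]          # pairwise score gaps
    sig = sigmoid(-beta * d)                     # ~1 when a pair is misordered
    loss = sig.mean()
    coeff = -beta * sig * (1.0 - sig)            # chain rule through the sigmoid
    grad = (coeff.sum(axis=1) @ X_pos - coeff.sum(axis=0) @ X_neg) / coeff.size
    return loss, grad

# Plain gradient descent on the surrogate objective.
w = np.zeros(X_pos.shape[1])
for _ in range(200):
    loss, grad = surrogate_loss_and_grad(w, X_pos, X_neg)
    w -= 0.5 * grad

print("surrogate loss:", round(loss, 4))
print("AUC (WMW statistic):", round(wmw_statistic(X_pos @ w, X_neg @ w), 4))

Because every (positive, negative) pair contributes to the surrogate, the gradient pushes positive scores above negative scores globally, which is exactly the ordering the WMW statistic (and hence the AUC) measures.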
Type: Conference
Year: 2003
Where: ICML
Authors: Lian Yan, Robert H. Dodier, Michael Mozer, Richard H. Wolniewicz