Using and Combining Predictors That Specialize

Abstract. We study online learning algorithms that predict by combining the predictions of several subordinate prediction algorithms, sometimes called “experts.” These simple algorithms belong to the multiplicative weights family of algorithms. The performance of these algorithms degrades only logarithmically with the number of experts, making them particularly useful in applications where the number of experts is very large. However, in applications such as text categorization, it is often natural for some of the experts to abstain from making predictions on some of the instances. We show how to transform algorithms that assume that all experts are always awake to algorithms that do not require this assumption. We also show how to derive corresponding loss bounds. Our method is very general, and can be applied to a large family of online learning algorithms. We also give applications to various prediction models including decision graphs and “switching” experts.
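The abstract describes the "specialists" idea: on any given round only the awake experts predict and are updated, while sleeping experts' weights are left untouched. As a rough illustration, here is a minimal sketch of one such multiplicative-weights round in Python; the exponential update, the learning rate `eta`, and the renormalization that preserves the awake set's total weight are assumptions chosen for illustration, not necessarily the paper's exact rule.

```python
import numpy as np

def predict(weights, awake, expert_preds):
    """Weighted-average prediction over the awake experts only.

    awake is a boolean mask selecting the experts that did not abstain
    this round (assumed non-empty).
    """
    w = weights[awake]
    return np.dot(w, expert_preds[awake]) / w.sum()

def specialists_update(weights, awake, losses, eta=0.5):
    """One sketched multiplicative-weights round with abstentions.

    Awake experts are downweighted exponentially in their losses;
    sleeping experts keep their weights unchanged.
    """
    w = weights.copy()
    awake_mass = w[awake].sum()
    # Multiplicative update applied to awake experts only.
    w[awake] *= np.exp(-eta * losses[awake])
    # Renormalize so the awake set's total weight is preserved,
    # leaving the sleeping experts' weights untouched.
    w[awake] *= awake_mass / w[awake].sum()
    return w

# Example round: experts 0 and 2 are awake, expert 1 abstains.
weights = np.ones(3) / 3
awake = np.array([True, False, True])
preds = np.array([0.9, 0.0, 0.2])
losses = np.array([0.8, 0.0, 0.1])
print(predict(weights, awake, preds))
print(specialists_update(weights, awake, losses))
```

In this sketch, renormalizing only within the awake set means an expert's weight is compared only against experts that were awake on the same rounds, which is what lets the loss bounds carry over when some experts abstain.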
Type Conference
Year 1997
Where STOC
Publisher ACM
Authors Yoav Freund, Robert E. Schapire, Yoram Singer, Manfred K. Warmuth