The Power of Selective Memory: Self-Bounded Learning of Prediction Suffix Trees

Prediction suffix trees (PSTs) provide a popular and effective tool for tasks such as compression, classification, and language modeling. In this paper we take a decision-theoretic view of PSTs for the task of sequence prediction. Generalizing the notion of margin to PSTs, we present an online PST learning algorithm and derive a loss bound for it. The depth of the PST generated by this algorithm scales linearly with the length of the input. We then describe a self-bounded enhancement of our learning algorithm which automatically grows a bounded-depth PST, and we prove an analogous mistake bound for this self-bounded algorithm. The result is an efficient algorithm that neither relies on a priori assumptions about the shape or maximal depth of the target PST nor requires any parameters. To our knowledge, this is the first provably correct PST learning algorithm that generates a bounded-depth PST while remaining competitive with any fixed PST determined in hindsight.
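For readers unfamiliar with the data structure, the sketch below is a rough, generic illustration of a bounded-depth prediction suffix tree used for next-symbol prediction: it counts next-symbol frequencies for every context (suffix of the history) up to a fixed depth and predicts with the longest observed context. The class name BoundedPST, the max_depth cap, and the frequency-based prediction rule are assumptions made here for illustration only; the paper's contribution is a margin-based online learner that bounds the tree depth automatically rather than requiring such a cap.

from collections import defaultdict

class BoundedPST:
    """Minimal prediction suffix tree sketch (illustrative only, not
    the paper's algorithm): maps each context (a suffix of the history
    up to max_depth symbols) to counts of the symbols that followed it."""

    def __init__(self, max_depth=5):
        self.max_depth = max_depth  # fixed depth cap; the paper's method avoids this
        self.counts = defaultdict(lambda: defaultdict(int))  # context -> symbol -> count

    def update(self, history, next_symbol):
        # Record next_symbol under every suffix of the history,
        # from the empty context up to max_depth symbols long.
        for d in range(min(self.max_depth, len(history)) + 1):
            context = tuple(history[len(history) - d:])
            self.counts[context][next_symbol] += 1

    def predict(self, history):
        # Predict with the longest context that has been observed,
        # falling back to shorter contexts (ultimately the empty one).
        for d in range(min(self.max_depth, len(history)), -1, -1):
            context = tuple(history[len(history) - d:])
            if context in self.counts and self.counts[context]:
                return max(self.counts[context], key=self.counts[context].get)
        return None  # nothing observed yet

# Example usage on a simple alternating sequence:
pst = BoundedPST(max_depth=3)
seq = "abababab"
for i in range(len(seq)):
    guess = pst.predict(seq[:i])   # predict before seeing the true symbol
    pst.update(seq[:i], seq[i])    # then update the tree online

Predicting before updating mirrors the online protocol analyzed in the paper; the difference is that the paper grows contexts only as far as the data and the margin-based loss bound require, rather than up to a preset depth.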
Ofer Dekel, Shai Shalev-Shwartz, Yoram Singer
Type Conference
Year 2004
Where NIPS