Convergence of Discrete MDL for Sequential Prediction

We study the properties of the Minimum Description Length principle for sequence prediction, considering a two-part MDL estimator chosen from a countable class of models. This applies in particular to the important case of universal sequence prediction, where the model class corresponds to all algorithms for some fixed universal Turing machine (the correspondence is via enumerable semimeasures, hence the resulting models are stochastic). We prove convergence theorems similar to Solomonoff’s theorem of universal induction, which also holds for general Bayes mixtures. The bound characterizing the convergence speed for MDL predictions is exponentially larger than the corresponding bound for Bayes mixtures. We observe that there are at least three different ways of using MDL for prediction. One of these has weaker prediction properties: its predictions converge only if the MDL estimator stabilizes. We establish sufficient conditions for this to occur. Finally, some immediate consequences...
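
For orientation, a sketch of the standard two-part MDL setup the abstract refers to (the notation below is generic MDL notation under assumed conventions, not quoted from the paper): given a countable class \mathcal{M} of models with prior weights w_\nu > 0 satisfying \sum_{\nu \in \mathcal{M}} w_\nu \le 1, the two-part MDL estimator for observed data x is

    \nu^{MDL}(x) = \arg\min_{\nu \in \mathcal{M}} \bigl[ -\log w_\nu - \log \nu(x) \bigr],

i.e. it minimizes the description length of the model (-\log w_\nu) plus the description length of the data given the model (-\log \nu(x)). Prediction is then carried out with the single selected model \nu^{MDL}, whereas a Bayes mixture instead predicts with \xi(x) = \sum_{\nu \in \mathcal{M}} w_\nu \, \nu(x); the abstract's claim is that the convergence bounds for the former are exponentially larger than for the latter.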
Type: Conference
Year: 2004
Venue: COLT
Publisher: Springer
Authors: Jan Poland, Marcus Hutter