
ECML 2006 (Springer)
Bandit Based Monte-Carlo Planning
Abstract. For large state-space Markovian Decision Problems, Monte-Carlo planning is one of the few viable approaches to finding near-optimal solutions. In this paper we introduce a new...
Levente Kocsis, Csaba Szepesvári
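The abstract is truncated above, but the title indicates the standard way bandit algorithms are combined with Monte-Carlo planning: treat action selection at each visited state as a multi-armed bandit and choose actions with a UCB1-style rule. The sketch below is only an illustration of that idea under assumed data structures; the Node fields, exploration constant, and function names are not taken from the paper.

```python
import math

class Node:
    """Per-action statistics kept by a Monte-Carlo planner (assumed layout)."""
    def __init__(self):
        self.visits = 0      # number of simulations through this action
        self.value = 0.0     # running mean of observed returns

def ucb1_select(children, c=math.sqrt(2)):
    """Pick the action maximizing the UCB1 score.

    `children` maps actions to Node statistics. `c` trades off exploration
    against exploitation; sqrt(2) is the usual default for UCB1.
    """
    total_visits = sum(child.visits for child in children.values())
    best_action, best_score = None, float("-inf")
    for action, child in children.items():
        if child.visits == 0:
            return action  # try every action at least once
        score = child.value + c * math.sqrt(math.log(total_visits) / child.visits)
        if score > best_score:
            best_action, best_score = action, score
    return best_action
```

In a planner of this kind, the selected action is simulated, the observed return is folded back into the corresponding Node's running mean, and the process repeats until the simulation budget is exhausted.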