
AAAI 2004
Solving Concurrent Markov Decision Processes
Typically, Markov decision problems (MDPs) assume a single action is executed per decision epoch, but in the real world one may frequently execute certain actions in parallel. Thi...
Mausam, Daniel S. Weld
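
The abstract contrasts classical MDPs, where one action is executed per decision epoch, with concurrent MDPs, where several actions may run in parallel. As an illustrative sketch only (not the paper's algorithm), the idea can be shown by extending plain value iteration so that the action set includes joint combinations of primitive actions; every state, transition probability, and reward below is a made-up toy example:

```python
import itertools

GAMMA = 0.9  # discount factor (hypothetical choice)

# Hypothetical primitive actions; joint actions are non-empty subsets
# executed in parallel during one decision epoch.
PRIMITIVES = ["a", "b"]

def action_combinations(primitives):
    """Enumerate non-empty subsets of primitives as joint actions."""
    combos = []
    for r in range(1, len(primitives) + 1):
        combos.extend(itertools.combinations(primitives, r))
    return combos

# Toy transition model: T[state][joint_action] -> [(next_state, prob, reward)].
# Running "a" and "b" together in state 0 reaches state 1 faster, but only
# with probability 0.8 -- a stand-in for interacting concurrent effects.
T = {
    0: {("a",):     [(1, 1.0, 0.0)],
        ("b",):     [(0, 1.0, 0.0)],
        ("a", "b"): [(1, 0.8, 0.0), (0, 0.2, 0.0)]},
    1: {("a",):     [(1, 1.0, 1.0)],
        ("b",):     [(1, 1.0, 1.0)],
        ("a", "b"): [(1, 1.0, 1.0)]},
}

def value_iteration(T, gamma=GAMMA, eps=1e-6):
    """Standard value iteration; joint actions are treated like any other."""
    V = {s: 0.0 for s in T}
    while True:
        delta = 0.0
        for s in T:
            # Bellman backup: maximize expected return over all joint actions.
            best = max(
                sum(p * (r + gamma * V[s2]) for s2, p, r in outcomes)
                for outcomes in T[s].values()
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            return V
```

Note that enumerating all subsets of actions grows exponentially with the number of primitives, which is precisely why specialized methods for concurrent MDPs (such as those the paper studies) are needed rather than this brute-force enumeration.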