Sciweavers

89 search results - page 2 / 18
ICTAI 2000, IEEE
Building efficient partial plans using Markov decision processes
Markov Decision Processes (MDPs) have been widely used as a framework for planning under uncertainty. They make it possible to compute optimal sequences of actions in order to achieve a given...
Pierre Laroche
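
The abstract above frames MDPs as a tool for computing optimal action sequences under uncertainty. As a point of reference only, here is a minimal value-iteration sketch for a small tabular MDP; the transition and reward numbers are made up, and it does not reproduce the paper's partial-plan construction.

```python
# Minimal value-iteration sketch for a tabular MDP (illustrative only;
# the transition and reward numbers below are invented, not from the paper).
import numpy as np

n_states, n_actions, gamma = 3, 2, 0.95

# P[a][s, s'] = probability of moving from s to s' under action a (hypothetical).
P = np.array([
    [[0.9, 0.1, 0.0], [0.1, 0.8, 0.1], [0.0, 0.1, 0.9]],   # action 0
    [[0.2, 0.8, 0.0], [0.0, 0.2, 0.8], [0.0, 0.0, 1.0]],   # action 1
])
# R[s, a] = expected immediate reward (hypothetical).
R = np.array([[0.0, 0.0], [0.0, 0.0], [1.0, 1.0]])

V = np.zeros(n_states)
for _ in range(1000):
    # Bellman optimality backup: V(s) <- max_a [ R(s,a) + gamma * sum_s' P(s'|s,a) V(s') ]
    Q = R + gamma * np.einsum("ast,t->sa", P, V)
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=1)  # greedy policy: best action per state
print(V, policy)
```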
ICML 2005, IEEE
Coarticulation: an approach for generating concurrent plans in Markov decision processes
We study an approach for performing concurrent activities in Markov decision processes (MDPs) based on the coarticulation framework. We assume that the agent has multiple degrees ...
Khashayar Rohanimanesh, Sridhar Mahadevan
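
The abstract above is truncated, but the general idea it names is letting several activities progress at once. The sketch below is one assumption-laden way to illustrate concurrency of this kind: restrict attention to actions that are near-optimal for a primary subgoal, then choose among them to serve a secondary subgoal. The Q-arrays and the epsilon threshold are hypothetical, and this is not presented as the paper's algorithm.

```python
# One possible sketch of pursuing two activities concurrently (an illustration
# under stated assumptions, not the paper's method): among actions that are
# near-optimal for a primary subgoal, pick the one best for a secondary subgoal.
import numpy as np

def concurrent_action(q_primary, q_secondary, state, epsilon=0.1):
    """q_primary / q_secondary: arrays of shape (n_states, n_actions) (hypothetical)."""
    qp = q_primary[state]
    admissible = np.flatnonzero(qp >= qp.max() - epsilon)          # near-optimal for task 1
    return admissible[np.argmax(q_secondary[state, admissible])]   # best among them for task 2

# Tiny example with made-up Q-values for one state and three actions.
q1 = np.array([[1.0, 0.95, 0.2]])
q2 = np.array([[0.1, 0.9, 1.0]])
print(concurrent_action(q1, q2, state=0))   # -> 1: nearly optimal for task 1, good for task 2
```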
UAI 2004
Solving Factored MDPs with Continuous and Discrete Variables
Although many real-world stochastic planning problems are more naturally formulated as hybrid models with both discrete and continuous variables, current state-of-the-art methods ...
Carlos Guestrin, Milos Hauskrecht, Branislav Kveto...
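
The abstract above motivates planning over hybrid models that mix discrete and continuous state variables. The sketch below only illustrates what such a hybrid state can look like, using a Monte Carlo Bellman backup that averages sampled transitions; the modes, dynamics, and rewards are invented for illustration and are not taken from the paper.

```python
# Illustrative sketch (not the paper's algorithm): a hybrid state made of a
# discrete mode in {"idle", "active"} and a continuous level in [0, 1], plus a
# Monte Carlo Bellman backup over sampled transitions. Everything is hypothetical.
import random

GAMMA = 0.9
ACTIONS = ("wait", "work")

def step(mode, level, action):
    """Hypothetical stochastic dynamics over the hybrid state (mode, level)."""
    if action == "work":
        mode = "active"
        level = max(0.0, level - random.uniform(0.0, 0.2))
    else:
        mode = "idle"
        level = min(1.0, level + random.uniform(0.0, 0.1))
    reward = 1.0 - level if mode == "active" else 0.0
    return mode, level, reward

def sampled_backup(value_fn, mode, level, n_samples=100):
    """Approximate max_a E[r + gamma * V(s')] by averaging sampled transitions."""
    best = float("-inf")
    for action in ACTIONS:
        total = 0.0
        for _ in range(n_samples):
            m2, l2, r = step(mode, level, action)
            total += r + GAMMA * value_fn(m2, l2)
        best = max(best, total / n_samples)
    return best

# Usage with a crude value estimate (zero everywhere) for a single backup:
print(sampled_backup(lambda m, l: 0.0, "idle", 0.5))
```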
ECML 2005, Springer
Using Rewards for Belief State Updates in Partially Observable Markov Decision Processes
Partially Observable Markov Decision Processes (POMDPs) provide a standard framework for sequential decision making in stochastic environments. In this setting, an agent takes actio...
Masoumeh T. Izadi, Doina Precup
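
The abstract above concerns belief tracking in POMDPs. For context, the sketch below shows the standard Bayesian belief update that such work builds on; it does not reproduce the paper's reward-based variant, and the transition and observation tables are made up.

```python
# Standard Bayesian belief update for a tabular POMDP (a baseline sketch only;
# the paper's reward-based update is not reproduced). T, O, and b are hypothetical.
import numpy as np

def belief_update(b, a, o, T, O):
    """b'(s') is proportional to O[a][s', o] * sum_s T[a][s, s'] * b(s)."""
    predicted = b @ T[a]                  # predict: sum_s T(s'|s,a) b(s)
    unnormalized = O[a][:, o] * predicted # correct: weight by observation likelihood
    return unnormalized / unnormalized.sum()

# Tiny 2-state, 1-action, 2-observation example (numbers are made up).
T = np.array([[[0.7, 0.3], [0.2, 0.8]]])    # T[a][s, s']
O = np.array([[[0.9, 0.1], [0.3, 0.7]]])    # O[a][s', o]
b = np.array([0.5, 0.5])
print(belief_update(b, a=0, o=1, T=T, O=O)) # posterior belief after acting and seeing o=1
```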
ICRA 2007, IEEE
Oracular Partially Observable Markov Decision Processes: A Very Special Case
We introduce the Oracular Partially Observable Markov Decision Process (OPOMDP), a type of POMDP in which the world produces no observations; instead there is an “oracle,” ...
Nicholas Armstrong-Crews, Manuela M. Veloso
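
The abstract above describes a POMDP variant in which the environment emits no observations and state information comes only from an oracle that can be consulted. The toy sketch below illustrates that setup under assumed dynamics, an assumed fixed query cost, and a naive fixed query schedule; none of these specifics come from the paper.

```python
# Toy illustration of the setup described above: the environment gives rewards
# but no observations, and the agent may pay a fixed fee to query an "oracle"
# for the true state. Dynamics, cost, and query schedule are all assumptions.
import random

ORACLE_COST = 0.5   # assumed fixed fee per oracle query

class NoObservationEnv:
    """A noisy 1-D walk whose state is hidden unless the oracle is queried."""
    def __init__(self):
        self._state = 0

    def act(self, move):
        self._state += move + random.choice([-1, 0, 1])   # noisy transition
        return -abs(self._state)                           # reward only, no observation

    def query_oracle(self):
        return self._state, -ORACLE_COST                   # true state, plus the fee

env = NoObservationEnv()
total, move = 0.0, 0
for t in range(20):
    if t % 5 == 0:                      # periodically buy ground truth from the oracle
        state, fee = env.query_oracle()
        total += fee
        move = -1 if state > 0 else 1   # steer back toward the origin
    total += env.act(move)              # act open-loop until the next query
print("return:", total)
```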