Sciweavers

AAAI
1997
Structured Solution Methods for Non-Markovian Decision Processes
Markov Decision Processes (MDPs), currently a popular method for modeling and solving decision-theoretic planning problems, are limited by the Markovian assumption: rewards and dy...
Fahiem Bacchus, Craig Boutilier, Adam J. Grove
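
The snippet below is a minimal value-iteration sketch for a toy MDP, included only to illustrate the Markovian assumption the abstract refers to: transition probabilities and rewards depend on the current state and action alone, not on the history. The transition matrix, rewards, and discount factor are made-up illustration values, not anything taken from the paper.

```python
# Toy MDP value iteration; all numbers are illustrative assumptions.
import numpy as np

n_states, n_actions, gamma = 3, 2, 0.95
# P[a, s, s'] = probability of moving from s to s' under action a
P = np.array([
    [[0.8, 0.2, 0.0], [0.1, 0.8, 0.1], [0.0, 0.2, 0.8]],
    [[0.9, 0.1, 0.0], [0.0, 0.9, 0.1], [0.0, 0.0, 1.0]],
])
R = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 2.0]])  # R[a, s]

V = np.zeros(n_states)
for _ in range(1000):
    Q = R + gamma * (P @ V)     # Q[a, s] = R[a, s] + gamma * sum_s' P[a, s, s'] * V[s']
    V_new = Q.max(axis=0)       # greedy Bellman backup over actions
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new
policy = Q.argmax(axis=0)       # greedy policy induced by the converged values
```

Non-Markovian rewards, the subject of the paper, are exactly the cases this backup cannot express directly, because R would have to depend on the trajectory rather than on (a, s).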
AAAI
2004
Solving Generalized Semi-Markov Decision Processes Using Continuous Phase-Type Distributions
We introduce the generalized semi-Markov decision process (GSMDP) as an extension of continuous-time MDPs and semi-Markov decision processes (SMDPs) for modeling stochastic decisi...
Håkan L. S. Younes, Reid G. Simmons
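
As a rough illustration of the phase-type idea named in the title, the sketch below approximates a non-exponential event duration with an Erlang distribution, i.e., a chain of k exponential phases. The target mean of 2.0 time units and the phase counts are illustrative assumptions, not the authors' construction or code.

```python
# Erlang (a simple phase-type) approximation of a fixed-mean duration.
import random

def sample_erlang(k, rate):
    """Sample a duration as the sum of k exponential phases with the given rate."""
    return sum(random.expovariate(rate) for _ in range(k))

# Mean of Erlang(k, rate) is k / rate and variance is k / rate**2,
# so more phases give a tighter, less memoryless duration distribution.
for k in (1, 4, 16):
    rate = k / 2.0
    samples = [sample_erlang(k, rate) for _ in range(10000)]
    mean = sum(samples) / len(samples)
    var = sum((x - mean) ** 2 for x in samples) / len(samples)
    print(f"k={k:2d}  mean≈{mean:.2f}  var≈{var:.2f}")
```

Replacing general durations with such chains of exponential phases is what lets a GSMDP be handled with continuous-time MDP machinery.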
ATAL
2007
Springer
Modeling plan coordination in multiagent decision processes
In multiagent planning, it is often convenient to view a problem as two subproblems: agent local planning and coordination. Thus, we can classify agent activities into two categor...
Ping Xuan
HICSS
2005
IEEE
Complex Decision Making Processes: their Modelling and Support
Decision-making processes, and the systems that support them, have focused for the most part on narrow disciplines, paradigms, perspectives, and pre-determined processes. Apart from t...
Angela Liew, David Sundaram
INFOCOM
2006
IEEE
To Peer or Not to Peer: Modeling the Evolution of the Internet's AS-Level Topology
Internet connectivity at the AS level, defined in terms of pairwise logical peering relationships, is constantly evolving. This evolution is largely a response to economic, po...
Hyunseok Chang, Sugih Jamin, Walter Willinger
IROS
2009
IEEE
Bayesian reinforcement learning in continuous POMDPs with Gaussian processes
Partially Observable Markov Decision Processes (POMDPs) provide a rich mathematical model to handle real-world sequential decision processes but require a known model to be solv...
Patrick Dallaire, Camille Besse, Stéphane R...
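
To illustrate the model-learning idea named in the title, the sketch below fits a Gaussian process to observed transitions so that unknown continuous dynamics can be predicted together with their uncertainty. The 1-D toy dynamics and the scikit-learn kernel choice are illustrative assumptions, not the authors' setup or code.

```python
# GP regression over observed (state, action) -> next-state transitions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
# Synthetic transition data: columns of X are (state, action), y is the next state.
X = rng.uniform(-2, 2, size=(50, 2))
y = 0.9 * X[:, 0] + 0.3 * X[:, 1] + 0.05 * rng.standard_normal(50)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X, y)

# Predictive mean and standard deviation for a new state-action pair;
# the uncertainty estimate is what a Bayesian RL agent can exploit for exploration.
mean, std = gp.predict(np.array([[0.5, -1.0]]), return_std=True)
print(f"predicted next state: {mean[0]:.3f} ± {std[0]:.3f}")
```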