Sciweavers

499 search results - page 54 / 100
Query: Model Minimization in Markov Decision Processes
CDC 2008, IEEE (Control Systems)
A density projection approach to dimension reduction for continuous-state POMDPs
Abstract— Research on numerical solution methods for partially observable Markov decision processes (POMDPs) has primarily focused on discrete-state models, and these algorithms ...
Enlu Zhou, Michael C. Fu, Steven I. Marcus
FGR 2006, IEEE (Biometrics)
Tracking Using Dynamic Programming for Appearance-Based Sign Language Recognition
We present a novel tracking algorithm that uses dynamic programming to determine the paths of target objects and that can track an arbitrary number of different objects. The...
Philippe Dreuw, Thomas Deselaers, David Rybach, Da...
CAISE 2010, Springer
Beyond Process Mining: From the Past to Present and Future
Abstract. Traditionally, process mining has been used to extract models from event logs and to check or extend existing models. This has been shown to be useful for improving processes ...
Wil M. P. van der Aalst, Maja Pesic, Minseok Song
ATAL 2003, Springer
Performance models for large scale multiagent systems: using distributed POMDP building blocks
Given a large group of cooperative agents, selecting the right coordination or conflict resolution strategy can have a significant impact on their performance (e.g., speed of co...
Hyuckchul Jung, Milind Tambe
AIPS 2007
Learning to Plan Using Harmonic Analysis of Diffusion Models
This paper summarizes research on an emerging framework for learning to plan using the Markov decision process (MDP) model. In this paradigm, two approaches to learning to plan...
Sridhar Mahadevan, Sarah Osentoski, Jeffrey Johns,...