My research focuses on using continuous-state partially observable Markov decision processes (POMDPs) to perform object manipulation tasks using a robotic arm. During object mani...
In designing a Markov decision process (MDP), one must define the world, its dynamics, a set of actions, and a reward function. MDPs are often applied in situations where there i...
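The four ingredients named above (world, dynamics, actions, reward) can be sketched concretely. This is a minimal illustrative example using a hypothetical one-dimensional grid world; the state space, dynamics, and reward values are assumptions for illustration, not taken from the excerpt.

```python
# Minimal MDP sketch: a 1-D grid world with 5 cells (illustrative only).

STATES = range(5)            # the "world": positions 0..4
ACTIONS = ("left", "right")  # the action set

def transition(s, a):
    """Dynamics: move one cell in the chosen direction, clipped at the edges."""
    if a == "left":
        return max(0, s - 1)
    return min(4, s + 1)

def reward(s, a, s_next):
    """Reward function: +1 for reaching the goal cell 4, else 0."""
    return 1.0 if s_next == 4 else 0.0
```

A POMDP extends this tuple with an observation model, since the agent cannot observe the state directly.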
David L. Roberts, Sooraj Bhat, Kenneth St. Clair, ...
The heuristics used for planning and search often take the form of pattern databases generated from abstracted versions of the given state space. Pattern databases are typically stored p ...
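The idea above can be sketched as follows: a pattern database maps each abstract state to its exact distance-to-goal in the abstracted space, precomputed by breadth-first search backwards from the abstract goal, and is then consulted as a heuristic during search. The toy domain and the projection function here are illustrative assumptions, not from the cited work.

```python
from collections import deque

def build_pdb(abstract_goal, neighbors):
    """BFS outward from the abstract goal; returns {abstract_state: cost}."""
    pdb = {abstract_goal: 0}
    frontier = deque([abstract_goal])
    while frontier:
        s = frontier.popleft()
        for t in neighbors(s):
            if t not in pdb:
                pdb[t] = pdb[s] + 1
                frontier.append(t)
    return pdb

# Toy abstract space: states 0..7, unit moves clipped to the range.
def neighbors(s):
    return [t for t in (s - 1, s + 1) if 0 <= t <= 7]

pdb = build_pdb(0, neighbors)

def heuristic(concrete_state, project=lambda s: s % 8):
    # Project the concrete state into the abstract space, then look up
    # the precomputed cost; costs in an abstraction are admissible.
    return pdb[project(concrete_state)]
```

Because abstraction only removes distinctions between states, the looked-up cost never overestimates the true cost, which is what makes the table usable as an admissible heuristic.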
Browsing a blog archive is currently not well supported. Users cannot gain an overview of a blog easily, nor do they receive adequate support for finding potentially interesting e...
As Internet services become ubiquitous, the selection and management of diverse server platforms now affects the bottom line of almost every firm in every industry. Ideally, such ...
Christopher Stewart, Terence Kelly, Alex Zhang, Ka...