CSL 2007, Springer

Partially observable Markov decision processes for spoken dialog systems

In a spoken dialog system, determining which action a machine should take in a given situation is a difficult problem because automatic speech recognition is unreliable and hence the state of the conversation can never be known with certainty. Much of the research in spoken dialog systems centres on mitigating this uncertainty and recent work has focussed on three largely disparate techniques: parallel dialog state hypotheses, local use of confidence scores, and automated planning. While in isolation each of these approaches can improve action selection, taken together they currently lack a unified statistical framework that admits global optimization. In this paper we cast a spoken dialog system as a partially observable Markov decision process (POMDP). We show how this formulation unifies and extends existing techniques to form a single principled framework. A number of illustrations are used to show qualitatively the potential benefits of POMDPs compared to existing techniques,...
Jason D. Williams, Steve Young
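
As a rough illustration of the POMDP view of dialog described in the abstract (not code from the paper), the sketch below maintains a belief, i.e. a probability distribution over possible user goals, and updates it with the standard POMDP belief-update rule after a noisy speech-recognition observation. The state set, transition and observation probabilities, and function names are all assumptions made for this example.

    import numpy as np

    # Generic POMDP belief update over dialog states:
    #   b'(s') proportional to P(o | s', a) * sum_s P(s' | s, a) * b(s)
    # All states, probabilities and names below are illustrative assumptions,
    # not the paper's models.

    def belief_update(belief, action, observation, T, O):
        """belief[s] = P(s); T[a][s, s'] = P(s' | s, a); O[a][s', o] = P(o | s', a)."""
        predicted = T[action].T @ belief               # predict the next-state distribution
        unnormalised = O[action][:, observation] * predicted
        return unnormalised / unnormalised.sum()       # renormalise to a distribution

    # Two hypothetical user goals, observed through an unreliable speech recogniser.
    states = ["wants_weather", "wants_traffic"]
    T = {"ask": np.array([[0.95, 0.05],                # the user's goal rarely changes
                          [0.05, 0.95]])}
    O = {"ask": np.array([[0.8, 0.2],                  # ASR reports the goal correctly ~80% of the time
                          [0.2, 0.8]])}

    belief = np.array([0.5, 0.5])                      # start maximally uncertain
    belief = belief_update(belief, "ask", 0, T, O)     # recogniser reported "weather" (index 0)
    print(dict(zip(states, belief.round(3))))          # {'wants_weather': 0.8, 'wants_traffic': 0.2}

In the full framework, this belief over all dialog state hypotheses, rather than a single best ASR hypothesis, is what drives action selection, which is what allows parallel state hypotheses, confidence scores, and automated planning to be optimised within one model.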
Type: Journal
Year: 2007
Where: CSL
Authors: Jason D. Williams, Steve Young