We explore a new Bayesian model for probabilistic grammars, a family of distributions over discrete structures that includes hidden Markov models and probabilistic context-free gr...
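As background for this abstract, a hidden Markov model is indeed a distribution over discrete structures: it assigns a probability to every observation sequence. A minimal sketch (all parameter values below are hypothetical, chosen only for illustration) computes that probability with the standard forward algorithm:

```python
# Toy HMM with 2 hidden states and 3 observation symbols.
# All parameter values are hypothetical, for illustration only.
pi = [0.6, 0.4]                          # initial state distribution
A  = [[0.7, 0.3], [0.4, 0.6]]            # transition probabilities
B  = [[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]  # emission probabilities

def forward_prob(obs):
    """Total probability of an observation sequence (forward algorithm)."""
    alpha = [pi[s] * B[s][obs[0]] for s in range(2)]
    for o in obs[1:]:
        alpha = [sum(alpha[r] * A[r][s] for r in range(2)) * B[s][o]
                 for s in range(2)]
    return sum(alpha)
```

Summing `forward_prob` over all sequences of a fixed length gives 1, which is what makes the HMM a proper distribution over discrete structures.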
In this paper we evaluate a method for generating synthetic speech at high speaking rates based on the interpolation of hidden semi-Markov models (HSMMs) trained on speech data re...
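The core idea of model interpolation can be sketched very simply. This is not the paper's actual HSMM procedure (which interpolates full spectral and duration distributions); it only illustrates linearly blending per-state parameter vectors of two trained models, with the model layout and state names invented for the example:

```python
# Hypothetical sketch: each "model" is a dict mapping state names to
# parameter vectors (e.g. Gaussian means). Real HSMM interpolation also
# covers duration distributions and covariances.
def interpolate(model_a, model_b, lam):
    """Per-state linear interpolation: (1 - lam) * a + lam * b."""
    return {state: [(1 - lam) * x + lam * y
                    for x, y in zip(model_a[state], model_b[state])]
            for state in model_a}

normal_rate = {"s1": [1.0, 2.0]}   # model trained on normal-rate speech
fast_rate   = {"s1": [0.5, 1.0]}   # model trained on fast-rate speech
mid = interpolate(normal_rate, fast_rate, 0.5)  # halfway between rates
```

Varying `lam` between 0 and 1 moves the synthetic voice continuously between the two training speaking rates.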
Michael Pucher, Dietmar Schabus, Junichi Yamagishi
A stochastic model of stroke-order variation is proposed and applied to stroke-order-free online Kanji character recognition. The proposed model is a hidden Markov model (HMM...
This paper considers a method for speech emotion recognition using a max-margin framework that incorporates a loss function based on the well-known Watson and Tellegen's...
We describe a framework for inducing probabilistic grammars from corpora of positive samples. First, samples are incorporated by adding ad-hoc rules to a working grammar; subseque...
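The first step described here, incorporating each positive sample as an ad-hoc rule, can be sketched as follows. The grammar representation (a dict from the start symbol to a set of right-hand sides) and the function name are assumptions made for illustration, not the framework's actual data structures:

```python
# Hypothetical sketch: incorporate each positive sample by adding an
# ad-hoc rule S -> sample to the working grammar. Later steps in the
# framework would generalize these flat rules into a compact grammar.
def incorporate(grammar, sample):
    """Add the rule S -> sample (a tuple of terminals) to the grammar."""
    grammar.setdefault("S", set()).add(tuple(sample))
    return grammar

g = {}
incorporate(g, ["a", "b"])
incorporate(g, ["a", "b", "b"])
```

After this step the grammar generates exactly the observed samples; generalization beyond them comes from the subsequent rule-merging phase.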