Recently there has been interest in the use of classifiers based on the product of experts (PoE) framework. PoEs offer an alternative to the standard mixture of experts (MoE) framework. The PoE framework may be viewed as examining the intersection of a series of experts, rather than the union as in the MoE framework. This paper presents a particular implementation of PoEs, the normalised product of Gaussians (PoG), in which each expert is a Gaussian mixture model. In this work, the PoG model is presented within a hidden Markov model framework. This allows the classification of variable-length data, such as speech data. Training and initialisation procedures are described for this PoG system. The relationship of the PoG system to other schemes, including covariance modelling schemes, is also discussed. In addition, the scheme is shown to be related to a standard approach in speech recognition, multiple-stream systems. The PoG system performance is examined on an automatic speech recognition task, Switchboard.
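For reference, the general product-of-experts likelihood the abstract refers to can be sketched as follows; the exact PoG parameterisation and normalisation are defined in the paper itself, and the symbols here ($\mathbf{o}$ for an observation, $p_m$ for the $m$-th expert, $Z$ for the normaliser) are illustrative only:
$$
p(\mathbf{o}) \;=\; \frac{1}{Z}\,\prod_{m=1}^{M} p_m(\mathbf{o}),
\qquad
Z \;=\; \int \prod_{m=1}^{M} p_m(\mathbf{o})\,\mathrm{d}\mathbf{o},
$$
where, in the PoG case described above, each expert $p_m(\mathbf{o})$ is a Gaussian mixture model and the product is embedded as a state output distribution within a hidden Markov model.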
M. J. F. Gales, S. S. Airey