A Gaussian mixture model (GMM) estimates a probability density function using the expectation-maximization (EM) algorithm. However, because EM converges only to a local optimum, the resulting estimate can perform poorly and can vary with the choice of initial parameters. This paper shows analytically that, in terms of Kullback-Leibler divergence, the performance of a GMM can be improved by a committee of GMMs trained from different initial parameters. Simulations on synthetic datasets demonstrate that a committee of as few as 10 models outperforms a single model.
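
To make the idea concrete, here is a minimal sketch of a committee of GMMs. It assumes the committee density is the equal-weight average of the member densities and uses scikit-learn's GaussianMixture with different random initializations; the dataset, mixture settings, and the use of held-out log-likelihood as a proxy for KL divergence are illustrative, not the paper's exact experimental setup.

import numpy as np
from sklearn.mixture import GaussianMixture

# Toy target: samples from a 1-D two-component mixture (illustrative only).
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 0.5, 500),
                       rng.normal(1.5, 1.0, 500)]).reshape(-1, 1)
test = np.concatenate([rng.normal(-2.0, 0.5, 500),
                       rng.normal(1.5, 1.0, 500)]).reshape(-1, 1)

# Fit a committee of GMMs, each started from a different random initialization.
n_members = 10
committee = [GaussianMixture(n_components=2, init_params="random",
                             n_init=1, random_state=seed).fit(data)
             for seed in range(n_members)]

def committee_density(x):
    """Equal-weight average of the member densities (assumed committee rule)."""
    densities = np.stack([np.exp(m.score_samples(x)) for m in committee])
    return densities.mean(axis=0)

# Compare held-out average log-likelihood of a single model vs. the committee;
# higher log-likelihood corresponds to lower KL divergence to the true density.
single_ll = committee[0].score(test)
committee_ll = np.log(committee_density(test)).mean()
print(f"single model: {single_ll:.4f}  committee: {committee_ll:.4f}")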