In this paper, the expectation-maximization (EM) algorithm for Gaussian mixture modeling is improved via three statistical tests. The first test is a multivariate normality criterion based on the Mahalanobis distance of a sample measurement vector from the center of a Gaussian component. This test is used to decide whether a component should be split into two. The second test is a central tendency criterion based on the observation that the multivariate kurtosis becomes large when the component to be split is a mixture of two or more underlying Gaussian sources with common centers. If the common-center hypothesis holds, the component is split into two new components whose centers are initialized to the center of the old component selected for splitting. Otherwise, the split is carried out by a discriminant derived from the third test, which is based on marginal cumulative distribution functions. Experimental results are presented against seve...
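The two decision statistics mentioned above, a Mahalanobis-distance normality check and the multivariate (Mardia) kurtosis, can be illustrated with a minimal sketch. The Python/NumPy code below is only an assumption-laden illustration of how such statistics are computed for one Gaussian component; the paper's exact thresholds, the splitting procedure, and the marginal-CDF discriminant are not reproduced here, and the function names are hypothetical.

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_sq(X, mu, Sigma):
    # Squared Mahalanobis distance of each row of X from the component center mu.
    diff = X - mu
    inv = np.linalg.inv(Sigma)
    return np.einsum('ij,jk,ik->i', diff, inv, diff)

def normality_score(X, mu, Sigma, alpha=0.05):
    # Fraction of points whose squared distance exceeds the chi-squared
    # (1 - alpha) quantile; for truly Gaussian data this is close to alpha.
    d = X.shape[1]
    return np.mean(mahalanobis_sq(X, mu, Sigma) > chi2.ppf(1 - alpha, df=d))

def mardia_kurtosis(X, mu, Sigma):
    # Mardia's multivariate kurtosis; its expected value under normality
    # is d*(d+2), and markedly larger values suggest a heavier-tailed
    # component, e.g. a mixture of Gaussians sharing a common center.
    return np.mean(mahalanobis_sq(X, mu, Sigma) ** 2)

# Hypothetical usage on data drawn from a single 3-D Gaussian:
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
mu, Sigma = X.mean(axis=0), np.cov(X, rowvar=False)
print(normality_score(X, mu, Sigma))         # roughly 0.05 for Gaussian data
print(mardia_kurtosis(X, mu, Sigma))         # roughly d*(d+2) = 15
```

In this sketch, a large exceedance fraction would flag a component as non-Gaussian (a candidate for splitting), while an elevated kurtosis would support the common-center hypothesis; the actual decision rules used in the paper may differ.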