We discuss the problem of overfitting of probabilistic neural networks in the framework of statistical pattern recognition. The probabilistic approach to neural networks provides a statistically justified subspace method of classification. The underlying structural mixture model includes binary structural parameters and can be optimized by the EM algorithm in full generality. Formally, the structural model reduces the number of estimated parameters, so the structural mixtures become less complex and less prone to overfitting. We illustrate how recognition accuracy and the effect of overfitting are influenced by the complexity of the mixture and by the size of the training set.
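To make the idea of binary structural parameters concrete, the following is a minimal sketch, not the paper's implementation: an EM loop for a Bernoulli mixture in which each component m models variable n either by its own parameter (structural parameter phi[m, n] = 1) or by a shared background parameter (phi[m, n] = 0), so that zeroed structural parameters reduce the effective number of estimated parameters. The structure update used here (keeping, per component, the r variables whose component distribution differs most from the background by a weighted Kullback-Leibler criterion) is an illustrative assumption; the names `em_structural_bernoulli`, `r`, `phi`, `theta0` are hypothetical.

```python
# Sketch of EM for a Bernoulli mixture with binary structural parameters.
# Assumption for illustration only; not the authors' exact criterion.
import numpy as np

def em_structural_bernoulli(X, M, r, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    N, D = X.shape
    w = np.full(M, 1.0 / M)                       # component weights
    theta = rng.uniform(0.25, 0.75, size=(M, D))  # component-specific parameters
    theta0 = X.mean(axis=0).clip(1e-3, 1 - 1e-3)  # shared background parameters
    phi = np.ones((M, D), dtype=bool)             # binary structural parameters

    for _ in range(n_iter):
        # E-step: posterior responsibilities q[i, m], computed on the log scale.
        t = np.where(phi[None], theta[None], theta0[None, None])
        logp = (X[:, None, :] * np.log(t) +
                (1 - X[:, None, :]) * np.log(1 - t)).sum(axis=2) + np.log(w)
        logp -= logp.max(axis=1, keepdims=True)
        q = np.exp(logp)
        q /= q.sum(axis=1, keepdims=True)

        # M-step: update weights and component-specific Bernoulli parameters.
        Nm = q.sum(axis=0)
        w = Nm / N
        theta = ((q.T @ X) / Nm[:, None]).clip(1e-3, 1 - 1e-3)

        # Structure step (illustrative): keep, per component, the r variables
        # whose distribution deviates most from the background, measured by a
        # weighted KL divergence between the two Bernoulli distributions.
        kl = Nm[:, None] * (theta * np.log(theta / theta0) +
                            (1 - theta) * np.log((1 - theta) / (1 - theta0)))
        phi = np.zeros((M, D), dtype=bool)
        for m in range(M):
            phi[m, np.argsort(kl[m])[-r:]] = True
    return w, theta, theta0, phi

# Toy usage: 200 random binary patterns, 3 components, 5 active variables each.
X = (np.random.default_rng(1).random((200, 10)) > 0.5).astype(float)
w, theta, theta0, phi = em_structural_bernoulli(X, M=3, r=5)
print(w, phi.sum(axis=1))  # mixture weights; r variables remain active per component
```

Under this reading, shrinking r directly trades model complexity against fit, which is the mechanism by which the structural mixtures described above would become less prone to overfitting on small training sets.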