An important theoretical tool in machine learning is the bias/variance decomposition of the generalization error, introduced for the mean square error in [3]. The decomposition rests on the concept of the average predictor: the bias is the error of the average predictor and constitutes the systematic part of the generalization error, while the variance is the variability of the predictor around the average predictor. We present a large class of error functions with the same desirable properties as the bias/variance decomposition in [3]. These error functions are derived from the exponential family of distributions via the statistical deviance measure. We prove that this family contains all error functions that are decomposable in this manner. Finally, we state the connection between the bias/variance decomposition and the ambiguity decomposition [7], and present a useful approximation of the ambiguity that is quadratic in the ensemble coefficients.

1 Notation and problem domain

The problem domain...
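For the mean square error, the decomposition of [3] referred to above can be written as follows; the symbols used here (a fixed input x, a fixed target t(x), a predictor \hat f trained on a random training set D, and the average predictor \bar f) are chosen for illustration and need not match the paper's own notation:

\[
E_D\!\left[\bigl(\hat f(x) - t(x)\bigr)^2\right]
= \underbrace{\bigl(\bar f(x) - t(x)\bigr)^2}_{\text{bias}}
+ \underbrace{E_D\!\left[\bigl(\hat f(x) - \bar f(x)\bigr)^2\right]}_{\text{variance}},
\qquad
\bar f(x) = E_D\!\left[\hat f(x)\right].
\]

The bias term is the error of the average predictor, and the variance term is the expected squared deviation of the trained predictor from the average predictor, matching the description given above.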