We study maximum a posteriori probability model order selection for linear regression models, assuming Gaussian-distributed noise and coefficient vectors. For the same data model, we also derive the minimum mean-square error (MMSE) coefficient vector estimate. The approaches are denoted BOSS (Bayesian order selection strategy) and BPM (Bayesian parameter estimation method), respectively. In their simplest form, both BOSS and BPM require a priori knowledge of the distribution of the coefficients. However, under the assumption that the coefficient variance profile is smooth, we derive "empirical Bayesian" versions of our algorithms that estimate the coefficient variance profile from the observations and thus require little or no information from the user. We show in numerical examples that the estimators can outperform several classical methods, including the well-known AICc and BIC rules for model order selection.
Yngve Selén, Erik G. Larsson
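
The following is a minimal sketch, not the authors' exact BOSS/BPM derivation, of how Bayesian order selection and MMSE coefficient estimation can be carried out for a linear regression y = X_k h_k + e with Gaussian noise e ~ N(0, sigma2*I) and a zero-mean Gaussian coefficient prior with variance profile gamma. The variable names, the uniform prior over model orders, and the zero-padding of the MMSE estimate are illustrative assumptions.

```python
import numpy as np

def log_evidence(y, X, k, sigma2, gamma):
    """Log marginal likelihood log p(y | order k) under a Gaussian coefficient prior."""
    Xk = X[:, :k]
    # Marginal covariance of y under order k: sigma2*I + Xk diag(gamma[:k]) Xk^T
    C = sigma2 * np.eye(len(y)) + Xk @ np.diag(gamma[:k]) @ Xk.T
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y) + len(y) * np.log(2 * np.pi))

def map_order_and_mmse(y, X, sigma2, gamma):
    """MAP model order and MMSE coefficient estimate (zero-padded to the maximum order)."""
    n_max = X.shape[1]
    logev = np.array([log_evidence(y, X, k, sigma2, gamma)
                      for k in range(1, n_max + 1)])
    # Assumed uniform prior over orders: the order posterior is proportional to the evidence
    w = np.exp(logev - logev.max())
    w /= w.sum()
    k_map = int(np.argmax(w)) + 1
    # MMSE estimate = posterior-weighted average of the per-order posterior means
    h_mmse = np.zeros(n_max)
    for k, wk in zip(range(1, n_max + 1), w):
        Xk = X[:, :k]
        Gk = np.diag(gamma[:k])
        # Posterior mean of h_k: Gk Xk^T (sigma2*I + Xk Gk Xk^T)^{-1} y
        mean_k = Gk @ Xk.T @ np.linalg.solve(sigma2 * np.eye(len(y)) + Xk @ Gk @ Xk.T, y)
        h_mmse[:k] += wk * mean_k
    return k_map, h_mmse
```

An "empirical Bayesian" variant in the spirit of the abstract would replace the user-supplied gamma with an estimate of the (assumed smooth) coefficient variance profile obtained from the observations themselves.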