A novel technique is presented for constructing sparse regression models based on the orthogonal least squares (OLS) method with boosting. The technique tunes the mean vector and diagonal covariance matrix of each individual regressor by incrementally minimizing the training mean-square error. An efficient weighted optimization method based on boosting is developed to append regressors one by one in an orthogonal forward selection procedure. Experimental results demonstrate that this construction technique offers a viable alternative to existing state-of-the-art kernel modeling methods for building parsimonious regression models.
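To make the orthogonal forward selection idea concrete, the sketch below shows a plain OLS forward-selection loop (without the paper's boosting-based regressor tuning): at each step, every remaining candidate regressor is orthogonalized against the columns already selected, and the one yielding the largest reduction in training mean-square error is appended. Function and variable names are illustrative, not from the paper.

```python
import numpy as np

def ols_forward_selection(Phi, y, n_terms):
    """Greedy orthogonal forward selection (plain OLS criterion).

    Phi     : (N, M) candidate regressor matrix
    y       : (N,)   target vector
    n_terms : number of regressors to select
    Returns the list of selected column indices.
    """
    N, M = Phi.shape
    selected = []
    Q = []  # orthogonalized versions of the selected columns
    for _ in range(n_terms):
        best_err, best_idx, best_q = -np.inf, None, None
        for m in range(M):
            if m in selected:
                continue
            w = Phi[:, m].copy()
            # Modified Gram-Schmidt: remove components along
            # the already-selected (orthogonal) directions.
            for q in Q:
                w -= (q @ w) / (q @ q) * q
            denom = w @ w
            if denom < 1e-12:  # candidate is (nearly) dependent
                continue
            # Error-reduction contribution of this candidate:
            # the training MSE drop it would produce if appended.
            err_reduction = (w @ y) ** 2 / denom
            if err_reduction > best_err:
                best_err, best_idx, best_q = err_reduction, m, w
        selected.append(best_idx)
        Q.append(best_q)
    return selected
```

In the paper's method, each appended regressor would additionally have its mean vector and diagonal covariance matrix tuned by a boosting-based weighted optimization before the next selection step; the loop structure above only illustrates the orthogonal forward selection backbone.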
Sheng Chen, Xunxian Wang, David J. Brown