Abstract. The convex optimisation problem involved in fitting a kernel probit regression (KPR) model can be solved efficiently via an iteratively re-weighted least-squares (IRWLS) approach. The use of successive quadratic approximations of the true objective function suggests an efficient approximate form of leave-one-out cross-validation for KPR, based on an existing exact algorithm for the weighted least-squares support vector machine. This forms the basis of an efficient gradient descent model selection procedure for tuning the regularisation and kernel parameters. Experimental results are given demonstrating the utility of this approach.
Gavin C. Cawley
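The IRWLS approach described in the abstract can be sketched as follows: each Newton step on the penalised probit likelihood reduces to a weighted least-squares system of the same form as a weighted LS-SVM. This is a minimal illustrative sketch, not the paper's implementation; the function names (`fit_kpr`, `rbf_kernel`), the RBF kernel choice, and the fixed parameter values are assumptions made for the example.

```python
# Minimal sketch of IRWLS fitting for kernel probit regression (KPR).
# Assumed setup: f = K @ alpha, labels y in {-1, +1}, objective
#   -sum_i log Phi(y_i f_i) + (lam/2) * alpha' K alpha,
# where Phi is the standard normal CDF.  Each Newton/IRWLS step solves
# the weighted least-squares system (K + lam * W^{-1}) alpha = z.
import numpy as np
from scipy.stats import norm

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Z (illustrative choice)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kpr(K, y, lam=1e-2, n_iter=25, tol=1e-8):
    """Fit a kernel probit model by IRWLS (Newton's method)."""
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        f = K @ alpha
        t = y * f
        r = norm.pdf(t) / norm.cdf(t)   # inverse Mills ratio
        u = y * r                       # gradient of the log-likelihood wrt f
        w = r * (t + r)                 # positive Newton weights
        z = f + u / w                   # working response of the WLS problem
        alpha_new = np.linalg.solve(K + lam * np.diag(1.0 / w), z)
        if np.max(np.abs(alpha_new - alpha)) < tol:
            alpha = alpha_new
            break
        alpha = alpha_new
    return alpha

# Usage: fit on a toy two-class problem; predictions are sign(K @ alpha).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(+1, 0.5, (20, 2))])
y = np.array([-1.0] * 20 + [+1.0] * 20)
K = rbf_kernel(X, X, gamma=0.5)
alpha = fit_kpr(K, y)
acc = np.mean(np.sign(K @ alpha) == y)
```

The successive quadratic (weighted least-squares) subproblems are what allow the exact leave-one-out algorithm for the weighted LS-SVM to be reused as an efficient approximate leave-one-out criterion for the KPR model.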