— To obtain accurate modeling results, it is of prime importance to find optimal values for the hyperparameters of the Support Vector Regression (SVR) model. In general, one searches for the parameters that minimize an estimate of the generalization error. In this study, we empirically investigate different performance measures found in the literature: k-fold cross-validation; the computationally intensive, but almost unbiased, leave-one-out error; its upper bounds, the radius/margin bound and the span bound; Vapnik's measure, which uses an estimate of the VC dimension; and the regularized risk functional itself. For each estimate we focus on accuracy, computational complexity, and the presence of local minima. The latter significantly influences the applicability of gradient-based search techniques for determining the optimal parameters.
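To make the selection procedure concrete, the following is a minimal sketch of the first and most common of these criteria: tuning SVR hyperparameters by minimizing a k-fold cross-validation estimate of the generalization error. It is not the paper's code; scikit-learn's SVR and GridSearchCV, the synthetic make_regression data, and the particular parameter grid are all illustrative assumptions.

```python
# Sketch: hyperparameter selection for SVR via k-fold cross-validation.
# Assumes scikit-learn; data, grid values, and fold count are illustrative.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.svm import SVR

# Synthetic regression problem standing in for real data.
X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)

# Candidate hyperparameters: regularization C, tube width epsilon,
# and RBF kernel width gamma.
param_grid = {
    "C": [0.1, 1.0, 10.0, 100.0],
    "epsilon": [0.01, 0.1, 0.5],
    "gamma": [0.01, 0.1, 1.0],
}

# 5-fold cross-validated (negative) MSE serves as the estimate of the
# generalization error that the search minimizes.
search = GridSearchCV(
    SVR(kernel="rbf"),
    param_grid,
    scoring="neg_mean_squared_error",
    cv=KFold(n_splits=5, shuffle=True, random_state=0),
)
search.fit(X, y)
print("best parameters:", search.best_params_)
print("CV estimate of MSE:", -search.best_score_)
```

A grid search is used here only because it needs no gradient information; the smoother estimates discussed in the abstract (span bound, radius/margin bound, regularized risk) are the ones amenable to gradient-based search, subject to the local-minima caveat above.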