This paper addresses the problem of variable ranking for Support Vector Regression. The relevance criteria we propose are based on leave-one-out bounds and their variants, and for these criteria we compare two search-space algorithms: recursive feature elimination and scaling-factor optimization based on gradient descent. All algorithms are compared on toy problems and real-world QSAR datasets. Results show that the span estimate criterion optimized through gradient descent yields a lower error rate with fewer variables, and that, when the number of variables is very large, an interesting alternative is a criterion based only on the Lagrangian multipliers of the Support Vector Regression problem.

Key words: Support Vector Regression, Variable Selection, Kernel Methods
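The recursive feature elimination strategy mentioned above can be sketched with scikit-learn's `RFE` wrapper around a linear-kernel SVR. This is a generic illustration of the elimination loop (rank features by weight magnitude, prune, refit), not the paper's leave-one-out-bound criterion; the dataset and parameter choices are assumptions for the example.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import RFE
from sklearn.svm import SVR

# Synthetic toy data: 20 features, only 5 of them informative.
X, y = make_regression(n_samples=100, n_features=20,
                       n_informative=5, random_state=0)

# RFE with a linear-kernel SVR: at each step the features with the
# smallest weight magnitudes are eliminated and the model is refit.
selector = RFE(SVR(kernel="linear"), n_features_to_select=5, step=1)
selector.fit(X, y)

print(selector.support_)   # boolean mask of the retained features
print(selector.ranking_)   # rank 1 = selected; higher = pruned earlier
```

With a nonlinear kernel the weight vector is not available directly, which is one motivation for criteria based on leave-one-out bounds or on the Lagrangian multipliers instead.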