Journal of Machine Learning Research, Volume 1

Robert C. Williamson, Editor

**Ronan Collobert** (collober@idiap.ch)
IDIAP, CP 592, rue du Simplon 4, 1920 Martigny, Switzerland
tel: +41 27 721 77 31, fax: +41 27 721 77 12

**Samy Bengio** (bengio@idiap.ch)
IDIAP, CP 592, rue du Simplon 4, 1920 Martigny, Switzerland
tel: +41 27 721 77 39, fax: +41 27 721 77 12

Support Vector Machines (SVMs) for regression problems are trained by solving
a quadratic optimization problem, which requires on the order of *l*^{2}
memory and time to solve, where *l* is the number of training
examples. In this paper, we propose a decomposition algorithm,
*SVMTorch*^{1}, which is
similar to *SVM-Light*,
proposed by Joachims [5] for classification problems,
but adapted to regression problems.
With this algorithm, one can now efficiently solve
large-scale regression problems (more than 20000 examples).
Comparisons with *Nodelib*, another publicly available
SVM algorithm for large-scale regression problems
by Flake and Lawrence [3], yielded significant time improvements.
Finally, based on a recent paper by Lin [9], we show that a
convergence proof exists for our algorithm.
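The *l*^{2} cost cited above can be made concrete: a naive solver materializes the full *l* × *l* kernel (Gram) matrix, whose memory footprint alone becomes prohibitive at the problem sizes the paper targets. A minimal back-of-the-envelope sketch (not code from the paper; it assumes 8-byte double-precision entries and a dense matrix):

```python
# Memory needed to store a dense l x l kernel matrix of 8-byte floats.
# This quadratic growth is what motivates decomposition algorithms such
# as SVMTorch, which optimize over small working sets of variables and
# never hold the whole matrix in memory at once.
def kernel_matrix_bytes(l: int, bytes_per_entry: int = 8) -> int:
    """Bytes required for the full l x l kernel (Gram) matrix."""
    return l * l * bytes_per_entry

if __name__ == "__main__":
    for l in (1000, 20000):
        gib = kernel_matrix_bytes(l) / 2**30
        print(f"l = {l:>6}: {gib:.2f} GiB")
```

At *l* = 20000 the dense matrix already needs roughly 3 GB, which explains why the comparison problems in the paper cannot be handled by a plain quadratic-programming solver.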
