We have presented a new decomposition algorithm intended to efficiently solve
large-scale regression problems using SVMs. This algorithm follows
the same principles as those used by Joachims [5] in his
classification algorithm. Compared to previously proposed decomposition
algorithms for regression, we have introduced an original method for
selecting the variables in the working set. We have shown how to solve
subproblems of size two analytically, as is done in *SMO* [12].
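The analytic size-two step can be sketched as follows. This is a minimal illustration, assuming the regression dual is written in the usual difference variables β_k = α_k − α_k* ∈ [−C, C] subject to Σ_k β_k = 0, so that a feasible update moves β_i and β_j in opposite directions by the same amount; the function name and sign conventions are assumptions for illustration, not the paper's actual implementation.

```python
def solve_size2_subproblem(beta_i, beta_j, g_i, g_j, k_ii, k_jj, k_ij, C):
    """Analytically optimize one pair (beta_i, beta_j).

    The equality constraint sum_k beta_k = 0 forces any feasible update
    to have the form (beta_i + d, beta_j - d); g_i and g_j denote the
    gradient of the dual objective w.r.t. beta_i and beta_j (convention
    assumed here for illustration).
    """
    eta = k_ii + k_jj - 2.0 * k_ij        # curvature along the feasible direction
    if eta <= 0.0:                        # degenerate pair: skip (no positive curvature)
        return beta_i, beta_j
    d = (g_j - g_i) / eta                 # unconstrained optimal step
    # Clip d so that both variables stay in the box [-C, C].
    d = min(d, C - beta_i, beta_j + C)
    d = max(d, -C - beta_i, beta_j - C)
    return beta_i + d, beta_j - d
```

Note that the update preserves the equality constraint by construction, since the two variables change by opposite amounts.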
An internal cache that keeps part of the kernel matrix in memory
enables the program to solve large problems without requiring storage
quadratic in the number of examples and without recomputing every
kernel evaluation, yielding an overall fast algorithm.
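The caching idea can be sketched as a least-recently-used cache over rows of the kernel matrix; the class below is an assumption for illustration (the paper's implementation and its row layout are not specified here), but it captures why memory stays bounded by the cache size rather than growing quadratically with the number of examples.

```python
from collections import OrderedDict

class KernelRowCache:
    """LRU cache holding at most `max_rows` rows of the kernel matrix.

    A hypothetical sketch: `kernel` is a function k(x, y) and `data`
    is the list of training examples. Memory use is bounded by
    max_rows * len(data) entries instead of len(data) ** 2.
    """

    def __init__(self, kernel, data, max_rows):
        self.kernel, self.data, self.max_rows = kernel, data, max_rows
        self.rows = OrderedDict()            # example index -> cached kernel row

    def row(self, i):
        if i in self.rows:
            self.rows.move_to_end(i)         # cache hit: mark as recently used
            return self.rows[i]
        if len(self.rows) >= self.max_rows:
            self.rows.popitem(last=False)    # evict the least-recently-used row
        self.rows[i] = [self.kernel(self.data[i], x) for x in self.data]
        return self.rows[i]
```

Rows touched by the working-set selection stay cached, so only cold rows incur fresh kernel evaluations.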
We have also shown that a convergence proof exists for our algorithm.
Finally, an experimental comparison with another algorithm has shown
significant training-time improvements on large-scale problems, with
training time generally scaling slightly less than quadratically in the
number of examples.

Journal of Machine Learning Research