We want to solve problem (3) taking into account a subset of the
variables only. To simplify the notation, let us define

$$
\beta = (\alpha_1, \dots, \alpha_l, \alpha_1^\star, \dots, \alpha_l^\star)^T
$$

and

$$
c = (y_1 - \varepsilon, \dots, y_l - \varepsilon,\; -y_1 - \varepsilon, \dots, -y_l - \varepsilon)^T
$$

as well as

$$
Q = \begin{pmatrix} K & -K \\ -K & K \end{pmatrix}.
$$

Problem (3) is thus equivalent to minimizing

$$
\frac{1}{2}\,\beta^T Q\,\beta - \beta^T c \tag{15}
$$

subject to

$$
\sum_{i=1}^{2l} s_i\,\beta_i = 0 \tag{16}
$$

and

$$
0 \le \beta_i \le C \quad \text{for } 1 \le i \le 2l, \tag{17}
$$

where again $s_i = 1$ for $i \le l$ and $s_i = -1$ for $i > l$.
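This stacked reformulation can be sanity-checked numerically. The sketch below is our own illustration (not part of the algorithm); it assumes the minimization form of the $\varepsilon$-SVR dual in (3), $\frac{1}{2}(\alpha-\alpha^\star)^T K (\alpha-\alpha^\star) - y^T(\alpha-\alpha^\star) + \varepsilon\,\mathbf{1}^T(\alpha+\alpha^\star)$, and checks that it coincides with (15) on random data:

```python
import numpy as np

rng = np.random.default_rng(0)
l, eps = 4, 0.1

# Random PSD kernel matrix K, targets y, and dual variables alpha, alpha*.
A = rng.normal(size=(l, l))
K = A @ A.T
y = rng.normal(size=l)
a, a_star = rng.uniform(0, 1, l), rng.uniform(0, 1, l)

# Original dual objective of (3), written as a minimization.
d = a - a_star
f_orig = 0.5 * d @ K @ d - y @ d + eps * np.sum(a + a_star)

# Stacked formulation (15): beta, c and Q as defined above.
beta = np.concatenate([a, a_star])
c = np.concatenate([y - eps, -y - eps])
Q = np.block([[K, -K], [-K, K]])
f_stacked = 0.5 * beta @ Q @ beta - beta @ c

assert np.isclose(f_orig, f_stacked)
```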

Now let us suppose we can decompose each of these vectors
into two parts (after having reordered the variables accordingly): the first
part corresponds to the variables of the working set $W$, which we want to optimize,
and the second part corresponds to the fixed variables, indexed by $N$:

$$
\beta = \begin{pmatrix} \beta_W \\ \beta_N \end{pmatrix}, \quad
c = \begin{pmatrix} c_W \\ c_N \end{pmatrix}, \quad
s = \begin{pmatrix} s_W \\ s_N \end{pmatrix}
$$

and

$$
Q = \begin{pmatrix} Q_{WW} & Q_{WN} \\ Q_{NW} & Q_{NN} \end{pmatrix}.
$$

Replacing these variables in (15), (16) and (17), and taking into account the fact that $Q_{WN} = Q_{NW}^T$ (since $Q$ is symmetric), the minimization problem is now

$$
\frac{1}{2}\,\beta_W^T Q_{WW}\,\beta_W - \beta_W^T \left( c_W - Q_{WN}\,\beta_N \right) \tag{18}
$$

(removing the constants that depend only on $\beta_N$), subject to

$$
\sum_{i \in W} s_i\,\beta_i = -\sum_{j \in N} s_j\,\beta_j \tag{19}
$$

and

$$
0 \le \beta_i \le C \quad \text{for } i \in W, \tag{20}
$$

where $s_i = 1$ if the $i$-th variable comes from $\alpha$ and $s_i = -1$ if it comes from $\alpha^\star$.
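As a quick check that (18) drops only terms that are constant in $\beta_W$, one can verify numerically that the gap between the full objective (15) and the reduced objective (18) does not depend on $\beta_W$. The sketch below is our own illustration; the working set $W = \{0, 1\}$ and all data are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
n, C = 6, 1.0                       # n stacked variables, working set W = {0, 1}

A = rng.normal(size=(n, n))
Q = A @ A.T                         # symmetric PSD, standing in for the reordered Q
c = rng.normal(size=n)
W, N = [0, 1], [2, 3, 4, 5]

beta_N = rng.uniform(0, C, len(N))  # fixed part of beta
Q_WW = Q[np.ix_(W, W)]
Q_WN = Q[np.ix_(W, N)]

def f_full(beta_W):
    """Objective (15) evaluated on the full stacked vector."""
    beta = np.concatenate([beta_W, beta_N])
    return 0.5 * beta @ Q @ beta - beta @ c

def f_reduced(beta_W):
    """Reduced objective (18), with the beta_N-only constants removed."""
    return 0.5 * beta_W @ Q_WW @ beta_W - beta_W @ (c[W] - Q_WN @ beta_N)

# The gap must be the same constant for any value of beta_W.
b1, b2 = rng.uniform(0, C, 2), rng.uniform(0, C, 2)
assert np.isclose(f_full(b1) - f_reduced(b1), f_full(b2) - f_reduced(b2))
```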

Minimizing (18) under the constraints (19) and (20) can be done with a constrained
quadratic optimizer, such as a conjugate gradient method with
projection or an interior point method [4].
Moreover, following Platt's idea in *SMO* [12],
if one fixes the size of the working set
to 2,
the problem can also be solved analytically.

This particular case is important because experimental results show that
it always gives the fastest convergence times. We explain it here
*because it is a different minimization problem* from the one proposed
by previous authors such as Smola and Schölkopf [14]; in fact, it
is easier because it only has 2 variables and not 4.

Let us again simplify the notation. Write the two variables of the working set as $z_1$ and $z_2$, their signs as $s_1$ and $s_2$, let $q_{11}$, $q_{12}$ and $q_{22}$ be the corresponding entries of $Q_{WW}$, let $(v_1, v_2)^T = c_W - Q_{WN}\,\beta_N$, and let

$$
\gamma = -\sum_{j \in N} s_j\,\beta_j.
$$

Minimizing (18) under the constraints (19) and (20) is thus equivalent to minimizing

$$
\frac{1}{2}\left( q_{11}\,z_1^2 + 2\,q_{12}\,z_1 z_2 + q_{22}\,z_2^2 \right) - v_1 z_1 - v_2 z_2 \tag{21}
$$

subject to

$$
s_1 z_1 + s_2 z_2 = \gamma \tag{22}
$$

and

$$
0 \le z_1, z_2 \le C. \tag{23}
$$

We are searching for a minimum of (21) with respect to $z_1$ along the line (22). Inserting $z_2 = s_2(\gamma - s_1 z_1)$ from (22) into (21), and after some derivations, this is equivalent to minimizing

$$
\frac{1}{2}\,\rho\,z_1^2 - \psi\,z_1, \quad \text{where} \quad
\rho = q_{11} - 2\,s_1 s_2\,q_{12} + q_{22} \quad \text{and} \quad
\psi = v_1 - s_1 s_2\,v_2 + s_1\gamma\,q_{22} - s_2\gamma\,q_{12}.
$$

In the case $\rho > 0$, the unconstrained minimum is reached at $z_1^{o} = \psi / \rho$.

Let us now consider the constraints (22) and (23).
They force $z_1$ to stay between $L$ and $H$, where

$$
L = \begin{cases} \max(0,\; s_2\gamma - C) & \text{if } s_1 s_2 = 1 \\ \max(0,\; -s_2\gamma) & \text{if } s_1 s_2 = -1 \end{cases}
\qquad \text{and} \qquad
H = \begin{cases} \min(C,\; s_2\gamma) & \text{if } s_1 s_2 = 1 \\ \min(C,\; C - s_2\gamma) & \text{if } s_1 s_2 = -1. \end{cases}
$$

Thus, taking

$$
z_1^{new} = \min\!\left( \max\!\left( L,\; z_1^{o} \right),\; H \right)
$$

and

$$
z_2^{new} = s_2\left( \gamma - s_1\,z_1^{new} \right),
$$

the minimum of (21) under the constraints (22) and (23) is obtained at $(z_1^{new}, z_2^{new})$ if $\rho > 0$. In the pathological case where $\rho = 0$, the objective is linear along the line (22), so it is clear that the solution

$$
z_1^{new} = \begin{cases} H & \text{if } \psi > 0 \\ L & \text{otherwise} \end{cases}
$$

and

$$
z_2^{new} = s_2\left( \gamma - s_1\,z_1^{new} \right)
$$

is the minimum.
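The complete analytic update can be collected into a few lines of code. The sketch below is a hypothetical helper (the name and signature are ours, not from the paper) that takes the subproblem data $(q_{11}, q_{12}, q_{22}, v_1, v_2, s_1, s_2, \gamma, C)$ and returns the minimizer of (21) under (22) and (23):

```python
def smo_step(q11, q12, q22, v1, v2, s1, s2, gamma, C):
    """Analytic minimizer of (21) subject to the constraints (22) and (23)."""
    s12 = s1 * s2
    # Feasible interval [L, H] for z1 induced by 0 <= z1, z2 <= C.
    if s12 == 1:
        L = max(0.0, s2 * gamma - C)
        H = min(C, s2 * gamma)
    else:
        L = max(0.0, -s2 * gamma)
        H = min(C, C - s2 * gamma)
    rho = q11 - 2 * s12 * q12 + q22               # curvature along the line (22)
    psi = v1 - s12 * v2 + s1 * gamma * q22 - s2 * gamma * q12
    if rho > 0:
        z1 = min(max(L, psi / rho), H)            # clipped unconstrained minimum
    else:
        z1 = H if psi > 0 else L                  # linear case: pick an endpoint
    z2 = s2 * (gamma - s1 * z1)                   # recover z2 from (22)
    return z1, z2
```

Feasibility of the interval $[L, H]$ is assumed (i.e. $L \le H$), which holds whenever the current feasible pair $(z_1, z_2)$ was used to compute $\gamma$.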