Machine Learning with Data Dependent Hypothesis Classes
Adam Cannon, J. Mark Ettinger, Don Hush, Clint Scovel
We extend the VC theory of statistical learning to
data dependent spaces of classifiers.
This theory can be viewed as a decomposition
of classifier design into two components:
the first is a restriction to a data dependent
hypothesis class, and the second is empirical risk
minimization within that class.
We define a measure of complexity for
data dependent hypothesis classes and
provide data dependent versions of
bounds on error deviance and estimation error.
We also describe
a structural risk minimization procedure
over data dependent hierarchies and prove its consistency.
We use this theory to provide a framework for
studying the trade-offs between performance and
computational complexity in classifier design.
As a consequence, we obtain
a new family of classifiers with dimension-independent
performance bounds and efficient learning procedures.