
Kernel-based L_2-Boosting with Structure Constraints

Yao Wang, Xin Guo, Shao-Bo Lin; 26(296):1−37, 2025.

Abstract

Developing efficient kernel methods for regression has been a popular line of research over the past two decades. In this paper, applying boosting to kernel-based weak learners, we propose a novel kernel-based learning algorithm called kernel-based re-scaled boosting with truncation, dubbed KReBooT. KReBooT controls the structure of its estimators, produces sparse solutions, and is nearly resistant to overfitting. We conduct both theoretical analysis and numerical simulations to demonstrate its performance. Theoretically, we prove that KReBooT achieves the optimal numerical convergence rate for nonlinear approximation. Furthermore, using a variant of Talagrand's concentration inequality, we establish fast learning rates for KReBooT that improve on existing results for boosting-type algorithms. Numerically, we carry out several simulations showing the promising performance of KReBooT in terms of good generalization, near over-fitting resistance, and structure constraints.
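The abstract does not spell out KReBooT's update rule, but the ingredients it names (boosting over kernel-based weak learners, re-scaling of the current estimator, truncation of coefficients for sparsity) can be illustrated with a minimal, hypothetical sketch. The Gaussian kernel, the re-scaling schedule, the step size, and the truncation threshold below are all illustrative assumptions, not the paper's specification.

```python
import numpy as np

def gaussian_kernel(x, centers, width=1.0):
    # Dictionary of weak learners: one Gaussian bump per training point.
    d = x[:, None] - centers[None, :]
    return np.exp(-(d ** 2) / (2 * width ** 2))

def rescaled_boosting_sketch(X, y, n_iter=200, step=0.1, trunc=1e-3):
    """Hypothetical sketch of re-scaled boosting with truncation.

    Each iteration: shrink (re-scale) the current estimator, greedily add
    the kernel atom most correlated with the residual, then truncate small
    coefficients to keep the estimator sparse.  Schedule and threshold are
    illustrative, not taken from the paper.
    """
    K = gaussian_kernel(X, X)               # n x n matrix of weak learners
    coef = np.zeros(X.shape[0])
    for t in range(1, n_iter + 1):
        r = y - K @ coef                    # residual of current estimator
        j = np.argmax(np.abs(K.T @ r))      # best-correlated weak learner
        coef *= 1.0 - step / (t + 1)        # re-scaling of past estimator
        coef[j] += step * np.sign(K[:, j] @ r)
        coef[np.abs(coef) < trunc] = 0.0    # truncation -> structure/sparsity
    return coef

# Toy usage on a noisy sine curve.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, 60))
y = np.sin(X) + 0.1 * rng.normal(size=60)
coef = rescaled_boosting_sketch(X, y)
```

The re-scaling step is what distinguishes this family from plain greedy boosting: shrinking past coefficients at each round lets the algorithm correct earlier over-commitments, while the truncation step enforces the sparse, structured estimators the abstract emphasizes.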

© JMLR 2025.
