Distributed Kernel Ridge Regression with Communications

Shao-Bo Lin, Di Wang, Ding-Xuan Zhou.

Year: 2020, Volume: 21, Issue: 93, Pages: 1−38


Abstract

This paper focuses on the generalization performance analysis of distributed algorithms in the framework of learning theory. Taking distributed kernel ridge regression (DKRR) as an example, we derive its optimal learning rates in expectation and provide theoretically optimal ranges for the number of local processors. To address the gap between theory and experiments, we also derive optimal learning rates for DKRR in probability, which more faithfully reflect the generalization performance and limitations of DKRR. Furthermore, we propose a communication strategy to improve the learning performance of DKRR and demonstrate the power of communications in DKRR via both theoretical assessments and numerical experiments.
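For orientation, the basic divide-and-conquer formulation underlying DKRR can be sketched as follows: each local processor solves an ordinary kernel ridge regression on its own data block, and the global estimator averages the local predictions. The sketch below is an illustrative assumption, not the paper's code; the Gaussian kernel, the regularization parameter lam, and all function names are hypothetical choices, and the communication strategy analyzed in the paper is not shown.

import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X1 and X2.
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

def local_krr(X, y, lam, sigma):
    # Solve kernel ridge regression on one local data block:
    # (K + lam * n * I) alpha = y, where n is the block size.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def dkrr_predict(blocks, X_test, lam=1e-3, sigma=1.0):
    # Divide-and-conquer DKRR (no communications):
    # average the local KRR predictions over all m blocks.
    preds = []
    for X_j, y_j in blocks:
        alpha_j = local_krr(X_j, y_j, lam, sigma)
        preds.append(gaussian_kernel(X_test, X_j, sigma) @ alpha_j)
    return np.mean(preds, axis=0)

# Toy usage: m = 4 local processors, synthetic 1-D regression data.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(400, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(400)
blocks = [(X[i::4], y[i::4]) for i in range(4)]
X_test = np.linspace(0, 1, 50)[:, None]
y_hat = dkrr_predict(blocks, X_test)

The averaging step is what the paper's analysis bounds in expectation and in probability; its proposed communication strategy augments this baseline with additional rounds of information exchange among the local processors.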
