
Optimal Bounds between f-Divergences and Integral Probability Metrics

Rohit Agrawal, Thibaut Horel; 22(128):1−59, 2021.

Abstract

The families of $f$-divergences (e.g. the Kullback-Leibler divergence) and Integral Probability Metrics (e.g. total variation distance or maximum mean discrepancies) are widely used to quantify the similarity between probability distributions. In this work, we systematically study the relationship between these two families from the perspective of convex duality. Starting from a tight variational representation of the $f$-divergence, we derive a generalization of the moment-generating function, which we show exactly characterizes the best lower bound of the $f$-divergence as a function of a given IPM. Using this characterization, we obtain new bounds while also recovering in a unified manner well-known results, such as Hoeffding's lemma, Pinsker's inequality and its extension to subgaussian functions, and the Hammersley-Chapman-Robbins bound. This characterization also allows us to prove new results on topological properties of the divergence which may be of independent interest.
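As an illustration of one of the classical results the paper recovers, here is a minimal numeric check of Pinsker's inequality, $\mathrm{TV}(P,Q) \le \sqrt{\mathrm{KL}(P\|Q)/2}$, for pairs of Bernoulli distributions (the helper functions below are illustrative, not from the paper):

```python
import math

def kl_bernoulli(p, q):
    # Kullback-Leibler divergence KL(Bernoulli(p) || Bernoulli(q)),
    # assuming 0 < p, q < 1.
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def tv_bernoulli(p, q):
    # Total variation distance between Bernoulli(p) and Bernoulli(q).
    return abs(p - q)

# Pinsker's inequality: TV(P, Q) <= sqrt(KL(P || Q) / 2).
for p, q in [(0.5, 0.4), (0.9, 0.2), (0.3, 0.7)]:
    tv = tv_bernoulli(p, q)
    bound = math.sqrt(kl_bernoulli(p, q) / 2)
    assert tv <= bound
```

Note that the bound is nearly tight when $p \approx q$ (e.g. $p=0.5$, $q=0.4$ gives $\mathrm{TV}=0.1$ against a bound of about $0.101$), which is consistent with the paper's focus on characterizing the *optimal* such bounds.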

© JMLR 2021.