Optimal Denoising in Score-Based Generative Models: The Role of Data Regularity
Eliot Beyler, Francis Bach; 26(293):1−48, 2025.
Abstract
Score-based generative models achieve state-of-the-art sampling performance by denoising a distribution perturbed by Gaussian noise. In this paper, we focus on a single deterministic denoising step, and compare the optimal denoiser for the quadratic loss, which we name "full-denoising", to the alternative "half-denoising" introduced by Hyvärinen (2025). We show that evaluating performance in terms of the distance between distributions tells a more nuanced story, with different assumptions on the data leading to very different conclusions. We prove that half-denoising is better than full-denoising for regular enough densities, while full-denoising is better for singular densities such as mixtures of Dirac measures or densities supported on a low-dimensional subspace. In the latter case, we prove that full-denoising can alleviate the curse of dimensionality under a linear manifold hypothesis.
© JMLR 2025.
