We will consider the problem of proving theoretical convergence guarantees for score-based generative models (SGMs), such as denoising diffusion probabilistic models (DDPMs), which are foundational to large-scale generative systems like DALL·E 2. We will prove that, given L2-accurate estimates of the score function, SGMs can efficiently sample from a broad class of realistic data distributions without restrictive assumptions such as log-concavity or a log-Sobolev inequality. The resulting convergence rate is polynomial in the problem parameters and matches the best known complexity bounds for discretizations of the Langevin diffusion. The proof rests on a clever application of Girsanov's theorem, a celebrated change-of-measure result from stochastic calculus that we will explore during the talk.
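As a preview, here is a schematic statement of the key tool (in standard SDE notation, not necessarily the paper's exact formulation). Girsanov's theorem says that if $(B_t)_{t \ge 0}$ is a Brownian motion under a measure $P$ and $(b_t)$ is an adapted process satisfying Novikov's condition $\mathbb{E}_P\big[\exp\big(\tfrac{1}{2}\int_0^T \lVert b_t\rVert^2 \,\mathrm{d}t\big)\big] < \infty$, then under the measure $Q$ defined by
\[
\frac{\mathrm{d}Q}{\mathrm{d}P}
= \exp\!\left(\int_0^T b_t \cdot \mathrm{d}B_t \;-\; \frac{1}{2}\int_0^T \lVert b_t\rVert^2 \,\mathrm{d}t\right),
\]
the process $\widetilde{B}_t := B_t - \int_0^t b_s \,\mathrm{d}s$ is a Brownian motion. A standard consequence is
\[
\mathrm{KL}(Q \,\|\, P) = \frac{1}{2}\,\mathbb{E}_Q\!\left[\int_0^T \lVert b_t\rVert^2 \,\mathrm{d}t\right].
\]
Roughly speaking, applying this with $P$ and $Q$ the path measures of the reverse SDE driven by the true score and by its estimate, so that $b_t$ is proportional to the score error, reduces bounding the sampling error to bounding the L2 score-estimation error, which is precisely the quantity assumed to be small.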
This talk is based on the paper: Chen, S., Chewi, S., Li, J., Li, Y., Salim, A., & Zhang, A. R. (2022). Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions. arXiv:2209.11215 (https://arxiv.org/abs/2209.11215).