We consider stochastic approximations of sampling algorithms such as Langevin Monte Carlo (pathwise approximation via random batches) and Stein Variational Gradient Descent (approximation in the space of probability distributions). These algorithms are widely deployed in Bayesian inference and the physical sciences.
We first consider pathwise approximation in Stochastic Gradient Langevin Dynamics (SGLD). We show that the noise induced by the random batches is approximately Gaussian (due to the Central Limit Theorem), while the Brownian motion driving the algorithm is exactly Gaussian. We exploit this structure to provide improved guarantees for sampling algorithms under significantly weaker assumptions. We then propose covariance correction, which rescales the Brownian motion so as to approximately cancel the random-batch error, and show that covariance-corrected algorithms enjoy even faster convergence.
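As a concrete illustration, the sketch below contrasts a plain SGLD step with one plausible form of a covariance-corrected step. All names here (grad_log_p_batch, per_example_grads, the step size eta) are illustrative assumptions, and the correction uses a simple sample-covariance estimate of the minibatch gradient noise; the exact estimator analyzed in the work may differ.

```python
import numpy as np

def sgld_step(x, grad_log_p_batch, eta, rng):
    """One SGLD step: minibatch gradient plus sqrt(2*eta) Gaussian noise."""
    g = grad_log_p_batch(x)  # minibatch estimate of grad log p(x)
    return x + eta * g + np.sqrt(2 * eta) * rng.standard_normal(x.shape)

def cc_sgld_step(x, per_example_grads, eta, rng):
    """Covariance-corrected SGLD step (illustrative form).

    per_example_grads: (B, d) array of per-example gradients of log p at x.
    The injected Gaussian noise is deflated so that the *total* step noise
    (batch noise + injected noise) has covariance approximately 2*eta*I.
    """
    B, d = per_example_grads.shape
    g = per_example_grads.mean(axis=0)
    # Sample covariance of the minibatch-mean gradient: Cov(g) ~= C / B.
    C = np.atleast_2d(np.cov(per_example_grads, rowvar=False)) / B
    # Target injected covariance: 2*eta*I - eta^2 * Cov(g). This assumes
    # eta is small enough that the matrix stays positive semi-definite.
    S = 2 * eta * np.eye(d) - eta**2 * C
    w, V = np.linalg.eigh(S)
    sqrt_S = V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T
    return x + eta * g + sqrt_S @ rng.standard_normal(d)
```

The design intent is that the batch noise (covariance roughly eta^2 * C / B) and the injected noise together match the 2*eta*I covariance of an exact Langevin discretization, which is why the injected covariance is deflated rather than fixed at 2*eta*I.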
We then consider stochastic approximation in the space of probability distributions to obtain a new particle discretization of Stein Variational Gradient Descent (SVGD), an interacting-particle sampling algorithm. We introduce and analyze Virtual Particle SVGD (VP-SVGD), which enjoys provably rapid convergence to the target. Under mild conditions, our rates provide a double-exponential improvement over the prior state-of-the-art convergence results for SVGD, yielding the first provably fast variant of SVGD.
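The virtual-particle construction itself is beyond a short sketch, but for orientation, here is the standard SVGD update that VP-SVGD discretizes via stochastic approximation. The RBF kernel, bandwidth h, and score function grad_log_p are standard choices assumed for illustration, not details taken from this abstract.

```python
import numpy as np

def svgd_step(X, grad_log_p, eps, h):
    """One standard SVGD step with an RBF kernel of bandwidth h.

    X: (n, d) array of particles; grad_log_p maps (n, d) -> (n, d),
    the score of the target evaluated at each particle.
    """
    diffs = X[:, None, :] - X[None, :, :]   # (n, n, d): x_i - x_j
    sq = np.sum(diffs**2, axis=-1)          # (n, n) squared distances
    K = np.exp(-sq / (2 * h**2))            # kernel matrix k(x_j, x_i)
    scores = grad_log_p(X)                  # (n, d)
    # Driving term: kernel-weighted average of scores (pulls toward modes).
    drive = K @ scores / X.shape[0]
    # Repulsion term: (1/n) sum_j grad_{x_j} k(x_j, x_i)
    #               = (1/n) sum_j K[i, j] * (x_i - x_j) / h^2.
    repulse = np.sum(K[:, :, None] * diffs, axis=1) / (h**2 * X.shape[0])
    return X + eps * (drive + repulse)
```

For example, with grad_log_p = lambda X: -X (the score of a standard Gaussian), repeated calls drive the particles toward Gaussian samples while the repulsion term keeps them spread apart.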
Based on joint work with Aniket Das (Google) and Anant Raj (INRIA and UIUC).