
On the geometry of Stein variational gradient descent


If you have a question about this talk, please contact Hong Duong.

Bayesian inference problems require sampling from, or approximating, high-dimensional probability distributions. The focus of this talk is the recently introduced Stein variational gradient descent (SVGD) methodology, a class of algorithms that rely on iterated steepest descent steps with respect to a reproducing kernel Hilbert space norm. This construction leads to interacting particle systems, the mean-field limit of which is a gradient flow on the space of probability distributions equipped with a certain geometrical structure. We leverage this viewpoint to shed some light on the convergence properties of the algorithm, in particular addressing the problem of choosing a suitable positive definite kernel function. Our analysis leads us to consider certain singular kernels with adjusted tails. This is joint work with N. Nusken (U. of Potsdam) and L. Szpruch (U. Edinburgh).
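For readers unfamiliar with the method, the iterated steepest descent step mentioned above can be sketched as follows. This is a minimal illustration of the standard SVGD particle update with an RBF kernel on a Gaussian target; the kernel, bandwidth, and step size are illustrative assumptions and are not the adjusted-tail kernels discussed in the talk.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    # Pairwise RBF kernel matrix K[i, j] = k(x_i, x_j) and the gradients
    # of k with respect to its first argument.
    diffs = X[:, None, :] - X[None, :, :]      # shape (n, n, d)
    sq = np.sum(diffs ** 2, axis=-1)           # squared distances, (n, n)
    K = np.exp(-sq / (2 * h ** 2))
    grad_K = -diffs / h ** 2 * K[:, :, None]   # grad_{x_i} k(x_i, x_j)
    return K, grad_K

def svgd_step(X, grad_log_p, step=0.1, h=1.0):
    # One SVGD update: each particle moves along
    #   phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j)
    #                            + grad_{x_j} k(x_j, x_i) ],
    # i.e. a kernel-smoothed drift toward high density plus a
    # repulsive term that keeps the particles spread out.
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    # K is symmetric, so K @ grad_log_p(X) gives the drift term;
    # summing grad_K over axis 0 gives sum_j grad_{x_j} k(x_j, x_i).
    phi = (K @ grad_log_p(X) + grad_K.sum(axis=0)) / n
    return X + step * phi
```

As a usage example, iterating `svgd_step` with `grad_log_p = lambda X: -X` (a standard Gaussian target) transports an initial particle cloud toward the target distribution.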

This talk is part of the Data Science and Computational Statistics Seminar series.




Talks@bham, University of Birmingham.