
Scaling Limits in Computational Bayesian Inversion


If you have a question about this talk, please contact Alexandra Tzella.

In this talk, we will discuss a parametric, deterministic formulation of Bayesian inverse problems with distributed parameter uncertainty from infinite-dimensional, separable Banach spaces, with a uniform prior probability measure on the uncertain parameter. The underlying forward problems are parametric, deterministic operator equations, and the goal of computational Bayesian inversion is to evaluate expectations of quantities of interest under the Bayesian posterior, conditional on given noisy observational data.
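In the standard formulation of such problems (the notation here is introduced for illustration and is not taken verbatim from the talk), the posterior is given by a density with respect to the prior measure, and the expectations computed in Bayesian inversion are ratios of prior integrals:

```latex
\frac{d\pi^{\delta}}{d\mu_0}(u) \;=\; \frac{\exp\bigl(-\Phi(u;\delta)\bigr)}{Z(\delta)},
\qquad
Z(\delta) \;=\; \int_U \exp\bigl(-\Phi(u;\delta)\bigr)\, d\mu_0(u),
```

so that, for a quantity of interest $\phi$,

```latex
\mathbb{E}^{\pi^{\delta}}[\phi]
\;=\; \frac{1}{Z(\delta)} \int_U \phi(u)\, \exp\bigl(-\Phi(u;\delta)\bigr)\, d\mu_0(u),
```

where $\mu_0$ denotes the uniform prior on the parameter domain $U$, $\delta$ the observed data, and $\Phi$ the negative log-likelihood (the least-squares misfit under additive Gaussian observation noise).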

For forward problems belonging to a certain sparsity class, we quantify the analytic regularity of the Bayesian posterior and prove that its parametric, deterministic density belongs to the same sparsity class. These results imply, in particular, dimension-independent convergence rates for data-adaptive Smolyak integration algorithms; the error bounds, however, depend exponentially on the inverse covariance of the additive Gaussian observation noise. We will also discuss asymptotic expansions of the Bayesian estimates. These expansions can be used to construct quadrature methods, combined with a curvature-based reparametrization of the parametric posterior density near its (assumed unique) global maximum, whose convergence rates are independent of both the number of parameters and the observation noise variance.
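As a toy illustration of the quantities involved (not the speaker's method, and with a hypothetical linear forward map), the posterior expectation can be computed as a ratio of two prior integrals, here approximated with tensor Gauss-Legendre quadrature under a uniform prior on [-1, 1]^2; Smolyak algorithms replace this full tensor grid with a sparse one to avoid the curse of dimensionality:

```python
import numpy as np

def forward(u):
    # Hypothetical linear forward map G(u) standing in for the operator equation.
    return u[0] + 0.5 * u[1]

def potential(u, data, noise_var):
    # Gaussian misfit Phi(u; delta) = |delta - G(u)|^2 / (2 sigma^2).
    return (data - forward(u)) ** 2 / (2.0 * noise_var)

def posterior_expectation(qoi, data, noise_var, n=30):
    # Tensor Gauss-Legendre rule on [-1, 1]^2; each coordinate carries the
    # uniform prior density 1/2, hence the factor 0.25 on the product weight.
    nodes, weights = np.polynomial.legendre.leggauss(n)
    Z = 0.0    # normalization constant Z(delta)
    Zq = 0.0   # unnormalized posterior integral of the quantity of interest
    for i, u1 in enumerate(nodes):
        for j, u2 in enumerate(nodes):
            w = weights[i] * weights[j] * 0.25
            dens = np.exp(-potential((u1, u2), data, noise_var))
            Z += w * dens
            Zq += w * qoi((u1, u2)) * dens
    return Zq / Z

# Posterior mean of the first parameter for observed data delta = 0.3.
est = posterior_expectation(lambda u: u[0], data=0.3, noise_var=0.1)
print(est)
```

Shrinking `noise_var` concentrates the posterior near the data-consistent set, which is exactly the regime where the error bounds above degrade and the curvature-based reparametrization becomes useful.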

This talk is part of the Applied Mathematics Seminar Series.


Talks@bham, University of Birmingham.
talks@bham is based on talks.cam from the University of Cambridge.