
## Maximum Conditional Entropy Hamiltonian Monte Carlo Sampler

- Jinglai Li (School of Mathematics, University of Birmingham)
- Tuesday 07 July 2020, 13:00-14:00
- via Zoom, Meeting ID: 326 414 7316, Password: 772020.
If you have a question about this talk, please contact Hong Duong.

The performance of the Hamiltonian Monte Carlo (HMC) sampler depends critically on algorithm parameters such as the total integration time and the numerical integration step size. Parameter tuning is particularly challenging when the mass matrix of the HMC sampler is adapted. In this work we propose a Kolmogorov-Sinai entropy (KSE) based design criterion for optimising these algorithm parameters, which avoids some potential issues with the commonly used jumping-distance based measures. For near-Gaussian distributions, we derive the optimal algorithm parameters with respect to the KSE criterion analytically. As a byproduct, the KSE criterion also provides a theoretical justification for adapting the mass matrix in the HMC sampler. Based on these results, we propose an adaptive HMC algorithm and demonstrate its performance with numerical examples.

This talk is part of the Data Science and Computational Statistics Seminar series.

## This talk is included in these lists:

- Data Science and Computational Statistics Seminar
- School of Mathematics events
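For context, the abstract refers to three tunable quantities in HMC: the step size ε, the number of leapfrog steps L (so that εL is the total integration time), and the mass matrix M. The sketch below is not the authors' algorithm or the KSE criterion itself; it is only a minimal NumPy illustration of a standard HMC step, showing where each of these parameters enters.

```python
import numpy as np

def leapfrog(q, p, grad_log_target, eps, n_steps, M_inv):
    """Leapfrog integration of Hamiltonian dynamics.

    eps is the step size; eps * n_steps is the total integration time.
    """
    q, p = q.copy(), p.copy()
    p += 0.5 * eps * grad_log_target(q)          # half momentum step
    for _ in range(n_steps - 1):
        q += eps * M_inv @ p                     # full position step
        p += eps * grad_log_target(q)            # full momentum step
    q += eps * M_inv @ p
    p += 0.5 * eps * grad_log_target(q)          # final half momentum step
    return q, p

def hmc_step(q, log_target, grad_log_target, eps, n_steps, M, rng):
    """One HMC transition: momentum ~ N(0, M), then Metropolis correction."""
    M_inv = np.linalg.inv(M)
    p = rng.multivariate_normal(np.zeros(len(q)), M)
    q_new, p_new = leapfrog(q, p, grad_log_target, eps, n_steps, M_inv)
    # Hamiltonian H(q, p) = -log pi(q) + (1/2) p^T M^{-1} p
    h_old = -log_target(q) + 0.5 * p @ M_inv @ p
    h_new = -log_target(q_new) + 0.5 * p_new @ M_inv @ p_new
    if np.log(rng.uniform()) < h_old - h_new:
        return q_new, True                       # accept proposal
    return q, False                              # reject, keep current state
```

In this sketch the mass matrix M only enters through the momentum distribution and the kinetic term; adapting M (as the talk's criterion motivates) amounts to reshaping these to match the local geometry of the target.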