Maximum Conditional Entropy Hamiltonian Monte Carlo Sampler
Data Science and Computational Statistics Seminar, Talks@bham, University of Birmingham
If you have a question about this talk, please contact Hong Duong.

The performance of the Hamiltonian Monte Carlo (HMC) sampler depends critically on algorithm parameters such as the total integration time and the numerical integration stepsize. Parameter tuning is particularly challenging when the mass matrix of the HMC sampler is adapted. In this work we propose a Kolmogorov-Sinai entropy (KSE) based design criterion for optimizing these algorithm parameters, which avoids some potential issues of the commonly used jumping-distance-based measures. For near-Gaussian distributions, we derive the optimal algorithm parameters with respect to the KSE criterion analytically. As a byproduct, the KSE criterion also provides a theoretical justification for adapting the mass matrix in the HMC sampler. Based on these results, we propose an adaptive HMC algorithm and demonstrate its performance with numerical examples.

This talk is part of the Data Science and Computational Statistics Seminar series.
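To make the tuning problem concrete, the sketch below implements a basic HMC sampler with the three parameters the abstract refers to: the stepsize `eps`, the number of leapfrog steps `L` (so the total integration time is `eps * L`), and the mass matrix `M`. This is a standard textbook HMC, not the adaptive KSE-based algorithm of the talk; all names and the example target (a standard Gaussian) are illustrative assumptions.

```python
import numpy as np

def hmc_sample(logp, grad_logp, x0, n_samples, eps, L, M, rng):
    """Basic HMC. Tuning parameters: stepsize eps, number of leapfrog
    steps L (total integration time T = eps * L), and mass matrix M.
    This is plain HMC, not the KSE-adaptive algorithm of the talk."""
    x = np.array(x0, dtype=float)
    Minv = np.linalg.inv(M)
    samples = []
    for _ in range(n_samples):
        # Sample momentum p ~ N(0, M).
        p = rng.multivariate_normal(np.zeros(x.size), M)
        x_new, p_new = x.copy(), p.copy()
        # Leapfrog integration of Hamiltonian dynamics.
        p_new += 0.5 * eps * grad_logp(x_new)
        for i in range(L):
            x_new += eps * Minv @ p_new
            if i < L - 1:
                p_new += eps * grad_logp(x_new)
        p_new += 0.5 * eps * grad_logp(x_new)
        # Metropolis correction; H(x, p) = -log p(x) + p' M^{-1} p / 2.
        h_old = -logp(x) + 0.5 * p @ Minv @ p
        h_new = -logp(x_new) + 0.5 * p_new @ Minv @ p_new
        if rng.random() < np.exp(h_old - h_new):
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

# Illustrative usage: 2D standard Gaussian target, identity mass matrix.
rng = np.random.default_rng(0)
logp = lambda x: -0.5 * x @ x
grad_logp = lambda x: -x
samples = hmc_sample(logp, grad_logp, np.zeros(2),
                     n_samples=2000, eps=0.2, L=10, M=np.eye(2), rng=rng)
```

The talk's point is that a good choice of `eps`, `L`, and `M` is target-dependent; jumping-distance criteria can be misleading here, which motivates the entropy-based criterion.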