
ADEV: Sound Automatic Differentiation of Expected Values of Probabilistic Programs


If you have a question about this talk, please contact George Kaye.


Abstract

Optimizing the expected values of probabilistic processes is a central problem in computer science and its applications, arising in fields ranging from artificial intelligence to operations research to statistical computing. Unfortunately, automatic differentiation techniques developed for deterministic programs do not in general compute the correct gradients needed for widely used solutions relying on gradient-based optimization.
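A concrete illustration of the failure mode (my own example, not taken from the abstract): suppose X ~ Bernoulli(θ) and the program simply returns X. Then

  E[X] = θ  and  d/dθ E[X] = 1,

but any single execution returns a constant 0 or 1 whose pathwise derivative with respect to θ is 0, so applying standard AD to sampled runs and averaging estimates 0 rather than the true value 1.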

I will talk about ADEV, an extension to forward-mode automatic differentiation that correctly differentiates the expectations of probabilistic processes represented as programs that make random choices. Our algorithm is a source-to-source program transformation on an expressive, higher-order language for probabilistic computation, with both discrete and continuous probability distributions. The result of our transformation is a new probabilistic program, whose expected return value is the derivative of the original program’s expectation. This output program can be run to generate unbiased estimates of the desired gradient. As it modularly extends forward-mode AD, our algorithm lends itself to a concise implementation strategy, which we exploit to develop a prototype in just a few dozen lines of Haskell (https://github.com/probcomp/adev).
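To give a flavour of what such an output program can look like, here is a minimal, self-contained Haskell sketch of my own (it is not the ADEV transformation or the API of the linked prototype, and all names are illustrative). For a single Bernoulli choice it hard-codes the score-function (REINFORCE) estimator, one kind of unbiased gradient estimator that such a transformation can produce: running the "derivative" program many times and averaging approximates d/dθ E[f(X)].

-- Sketch only: requires the "random" package; not the ADEV library or its API.
import System.Random (randomRIO)

-- Original probabilistic program: sample X ~ Bernoulli(theta), return f X.
-- Its expectation is theta * f True + (1 - theta) * f False.
original :: Double -> (Bool -> Double) -> IO Double
original theta f = do
  u <- randomRIO (0.0, 1.0)
  pure (f (u < theta))

-- "Derivative" program: one unbiased sample of d/dtheta E[f(X)], obtained by
-- weighting f X with the score d/dtheta log p(X; theta). ADEV derives
-- estimators like this automatically by program transformation.
derivEstimate :: Double -> (Bool -> Double) -> IO Double
derivEstimate theta f = do
  u <- randomRIO (0.0, 1.0)
  let x = u < theta
      score = if x then 1 / theta else -1 / (1 - theta)
  pure (f x * score)

main :: IO ()
main = do
  let theta = 0.3
      f b = if b then 10.0 else 2.0   -- true gradient: f True - f False = 8
      n = 100000 :: Int
  vals  <- mapM (\_ -> original theta f) [1 .. n]
  grads <- mapM (\_ -> derivEstimate theta f) [1 .. n]
  putStrLn ("estimated E[f(X)]          ~ " ++ show (sum vals  / fromIntegral n))  -- about 4.4
  putStrLn ("estimated d/dtheta E[f(X)] ~ " ++ show (sum grads / fromIntegral n))  -- about 8.0

Averaging naive pathwise derivatives of the original program would instead estimate 0, which is the incorrectness the talk addresses.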

This talk is part of the Theoretical computer science seminar series.
