Hi! PARIS Seminar – Bob Carpenter, 11 October 2022

Hi! PARIS is pleased to present a scientific seminar by Bob Carpenter for October's Hi! Tuesday.

Thanks to Marylou Gabrié, Hi! PARIS Chair holder 2021 and Assistant Professor at École Polytechnique (CMAP), we have the pleasure of welcoming Bob Carpenter for a Hi! PARIS seminar entitled “Pathfinder: Quasi-Newton Variational Inference”.

Tuesday 11 October 2022, 2:00-3:30 pm
Telecom Paris, Palaiseau (room 0A128)
On site + Zoom
Pathfinder: Quasi-Newton Variational Inference

The speaker will introduce the Pathfinder variational inference algorithm, which was motivated by the search for good initializations for Markov chain Monte Carlo (i.e., solving the “burn-in” problem). Pathfinder works by running quasi-Newton optimization (specifically, L-BFGS) on the target posterior, not on the stochastic ELBO as in other black-box variational inference algorithms. At each iteration of optimization, Pathfinder defines a variational approximation to the posterior in the form of a multivariate normal distribution whose covariance is the low-rank plus diagonal inverse Hessian estimate maintained by the optimizer. It then selects the approximation with the lowest KL divergence to the true posterior. Multi-path Pathfinder runs multiple instances of Pathfinder in parallel and then uses importance resampling to produce a final set of draws. The single-path algorithm provides much better approximations (measured by Wasserstein distance or KL divergence) than the previous state-of-the-art mean-field or full-rank black-box variational inference schemes, and the multi-path algorithm improves on it further for posteriors with multiple modes or complex geometry. The computational bottleneck is evaluating the KL divergence through the evidence lower bound (ELBO), but this step is embarrassingly parallelizable. Even without parallelization, Pathfinder is one to three orders of magnitude faster than state-of-the-art black-box variational inference, or than using the no-U-turn Hamiltonian Monte Carlo sampler for warmup, and it is also much more robust. We will show the results of evaluation on dozens of different models from the posteriordb test suite, as well as on a range of high-dimensional and multimodal problems. This is joint work with Lu Zhang (first author, who did most of the hard work), Aki Vehtari, and Andrew Gelman.
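To make the "optimize, approximate at each iterate, select by ELBO" structure concrete, here is a minimal Python sketch. It is not the paper's implementation: a damped Newton step on a toy Gaussian target stands in for L-BFGS, a diagonal Hessian stands in for the low-rank-plus-diagonal estimate, and the target, step damping, and draw counts are all illustrative assumptions. Selecting the highest Monte Carlo ELBO is equivalent to selecting the lowest KL divergence to the posterior, up to the unknown normalizing constant.

```python
import numpy as np

# Toy target posterior: an independent 2-D Gaussian (illustrative assumption).
rng = np.random.default_rng(0)
mu_true = np.array([1.0, -2.0])
var_true = np.array([2.0, 0.5])

def log_p(x):
    """Unnormalized log density of the toy target posterior."""
    return -0.5 * np.sum((x - mu_true) ** 2 / var_true)

def grad_log_p(x):
    return -(x - mu_true) / var_true

def hess_diag(x):
    """Diagonal of the negative Hessian (constant for a Gaussian target)."""
    return 1.0 / var_true

def mc_elbo(mean, var, n_draws=500):
    """Monte Carlo ELBO of q = N(mean, diag(var)) against log_p."""
    z = mean + np.sqrt(var) * rng.standard_normal((n_draws, mean.size))
    entropy = 0.5 * np.sum(np.log(2.0 * np.pi * np.e * var))
    return np.mean([log_p(zi) for zi in z]) + entropy

# Optimization path: damped Newton steps stand in for L-BFGS iterates.
x = np.array([8.0, 8.0])           # deliberately poor initialization
best = None
for _ in range(20):
    var_q = 1.0 / hess_diag(x)     # inverse-Hessian diagonal as covariance
    elbo = mc_elbo(x, var_q)       # highest ELBO == lowest KL to posterior
    if best is None or elbo > best[0]:
        best = (elbo, x.copy(), var_q.copy())
    x = x + 0.7 * var_q * grad_log_p(x)   # damped (quasi-)Newton step

elbo_best, mean_best, var_best = best
# Draws from the selected normal approximation: the Pathfinder output.
draws = mean_best + np.sqrt(var_best) * rng.standard_normal((1000, 2))
```

In the multi-path variant, several such runs from different random initializations would be pooled, with importance resampling used to select the final set of draws.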

Paper:  https://arxiv.org/abs/2108.03782
R implementation and evaluation:  https://github.com/LuZhangstat/Pathfinder
C++ implementation for Stan:  https://github.com/stan-dev/stan/pull/3123

Bob Carpenter 
Bob Carpenter is a research scientist at Flatiron Institute’s Center for Computational Mathematics.  He works on probabilistic programming languages, statistical inference algorithms, and applied statistics, primarily within the Stan community (https://mc-stan.org).
Before moving into statistics, Bob worked on theoretical linguistics, logic programming, natural language processing, speech recognition, and search, both in industry and academia.

Keywords: computational statistics ; probabilistic programming ; automatic differentiation ; natural language processing