
Electronic data

  • NeurIPS__Tempering_PDMP-2

    Accepted author manuscript, 933 KB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


Continuously-Tempered PDMP samplers

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Forthcoming
Journal publication date: 14/09/2022
Journal: Advances in Neural Information Processing Systems
Pages (from-to): 28293–28304
Volume: 35
Publication status: Accepted/In press
Original language: English

Abstract

New sampling algorithms based on simulating continuous-time stochastic processes called piecewise deterministic Markov processes (PDMPs) have shown considerable promise. However, these methods can struggle to sample from multimodal or heavy-tailed distributions. We show how tempering ideas can improve the mixing of PDMPs in such cases. We introduce an extended distribution defined over the state of the posterior distribution and an inverse temperature, which interpolates between a tractable distribution when the inverse temperature is 0 and the posterior when the inverse temperature is 1. The marginal distribution of the inverse temperature is a mixture of a continuous distribution on [0,1) and a point mass at 1, which means that samples obtained when the inverse temperature is 1 are draws from the posterior, while the sampler also explores distributions at lower temperatures, improving mixing. We show how PDMPs, and particularly the Zig-Zag sampler, can be implemented to sample from such an extended distribution. The resulting algorithm is easy to implement, and we show empirically that it can outperform existing PDMP-based samplers on challenging multimodal posteriors.
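
As a rough illustration of the construction described in the abstract, the following is a minimal sketch of one way such an extended target can be written, assuming a standard geometric tempering path; the reference distribution \(\pi_0\), the continuous density \(g\), and the mixture weight \(w\) are illustrative notation, not taken from the paper itself:

\[
  \pi_\beta(x) \;\propto\; \pi(x)^{\beta}\,\pi_0(x)^{1-\beta}, \qquad \beta \in [0,1],
\]
\[
  \tilde{\pi}(x,\beta) \;\propto\; \pi_\beta(x)\,p(\beta), \qquad
  p(\beta) \;=\; (1-w)\,g(\beta)\,\mathbf{1}_{[0,1)}(\beta) \;+\; w\,\delta_{1}(\beta),
\]

where \(\pi\) is the posterior, \(\pi_0\) is a tractable reference distribution, \(g\) is a continuous density on \([0,1)\), and \(\delta_1\) is a point mass at \(\beta = 1\). Under this construction, samples obtained with \(\beta = 1\) are draws from the posterior, while excursions to \(\beta < 1\) let the sampler move between modes more easily.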