Electronic data

  • NeurIPS__Tempering_PDMP-2

    Accepted author manuscript, 933 KB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License

Continuously-Tempered PDMP samplers

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Forthcoming

Standard

Continuously-Tempered PDMP samplers. / Sutton, Matthew; Salomone, Robert; Chevallier, Augustin et al.
In: Advances in Neural Information Processing Systems, Vol. 35, pp. 28293-28304, 14.09.2022.

Harvard

Sutton, M, Salomone, R, Chevallier, A & Fearnhead, P 2022, 'Continuously-Tempered PDMP samplers', Advances in Neural Information Processing Systems, vol. 35, pp. 28293-28304.

APA

Sutton, M., Salomone, R., Chevallier, A., & Fearnhead, P. (in press). Continuously-Tempered PDMP samplers. Advances in Neural Information Processing Systems, 35, 28293-28304.

Vancouver

Sutton M, Salomone R, Chevallier A, Fearnhead P. Continuously-Tempered PDMP samplers. Advances in Neural Information Processing Systems. 2022 Sept 14;35:28293-28304.

Author

Sutton, Matthew ; Salomone, Robert ; Chevallier, Augustin et al. / Continuously-Tempered PDMP samplers. In: Advances in Neural Information Processing Systems. 2022 ; Vol. 35, pp. 28293-28304.

Bibtex

@article{5d78bb0381c1467688d07003e9ec78f5,
title = "Continuously-Tempered PDMP samplers",
abstract = "New sampling algorithms based on simulating continuous-time stochastic processes called piecewise deterministic Markov processes (PDMPs) have shown considerable promise. However, these methods can struggle to sample from multi-modal or heavy-tailed distributions. We show how tempering ideas can improve the mixing of PDMPs in such cases. We introduce an extended distribution defined over the state of the posterior distribution and an inverse temperature, which interpolates between a tractable distribution when the inverse temperature is 0 and the posterior when the inverse temperature is 1. The marginal distribution of the inverse temperature is a mixture of a continuous distribution on [0,1) and a point mass at 1: which means that we obtain samples when the inverse temperature is 1, and these are draws from the posterior, but sampling algorithms will also explore distributions at lower temperatures which will improve mixing. We show how PDMPs, and particularly the Zig-Zag sampler, can be implemented to sample from such an extended distribution. The resulting algorithm is easy to implement and we show empirically that it can outperform existing PDMP-based samplers on challenging multimodal posteriors.",
author = "Matthew Sutton and Robert Salomone and Augustin Chevallier and Paul Fearnhead",
year = "2022",
month = sep,
day = "14",
language = "English",
volume = "35",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",
pages = "28293--28304",

}
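The abstract describes an extended target over the state and an inverse temperature that interpolates between a tractable reference at inverse temperature 0 and the posterior at inverse temperature 1. A minimal sketch of one standard way to realize such an interpolation is the geometric path below; the function and argument names are illustrative, and the paper's exact construction may differ.

```python
import numpy as np


def extended_log_density(x, beta, log_post, log_ref):
    """Geometric-path interpolation between a reference and the posterior.

    The extended target over (x, beta) is taken proportional to
        pi(x)**beta * pi_0(x)**(1 - beta),
    so beta = 0 recovers the tractable reference pi_0 and beta = 1
    recovers the posterior pi, matching the interpolation described
    in the abstract (a common tempering construction; illustrative only).
    """
    return beta * log_post(x) + (1.0 - beta) * log_ref(x)


# Toy example: posterior N(2, 1), reference N(0, 4).
log_post = lambda x: -0.5 * (x - 2.0) ** 2
log_ref = lambda x: -0.5 * x ** 2 / 4.0

print(extended_log_density(1.0, 0.0, log_post, log_ref))  # equals log_ref(1.0)
print(extended_log_density(1.0, 1.0, log_post, log_ref))  # equals log_post(1.0)
```

At intermediate beta the density is flatter than the posterior, which is what lets a sampler cross between modes before returning to beta = 1, where its draws target the posterior exactly.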

RIS

TY - JOUR

T1 - Continuously-Tempered PDMP samplers

AU - Sutton, Matthew

AU - Salomone, Robert

AU - Chevallier, Augustin

AU - Fearnhead, Paul

PY - 2022/9/14

Y1 - 2022/9/14

N2 - New sampling algorithms based on simulating continuous-time stochastic processes called piecewise deterministic Markov processes (PDMPs) have shown considerable promise. However, these methods can struggle to sample from multi-modal or heavy-tailed distributions. We show how tempering ideas can improve the mixing of PDMPs in such cases. We introduce an extended distribution defined over the state of the posterior distribution and an inverse temperature, which interpolates between a tractable distribution when the inverse temperature is 0 and the posterior when the inverse temperature is 1. The marginal distribution of the inverse temperature is a mixture of a continuous distribution on [0,1) and a point mass at 1: which means that we obtain samples when the inverse temperature is 1, and these are draws from the posterior, but sampling algorithms will also explore distributions at lower temperatures which will improve mixing. We show how PDMPs, and particularly the Zig-Zag sampler, can be implemented to sample from such an extended distribution. The resulting algorithm is easy to implement and we show empirically that it can outperform existing PDMP-based samplers on challenging multimodal posteriors.

AB - New sampling algorithms based on simulating continuous-time stochastic processes called piecewise deterministic Markov processes (PDMPs) have shown considerable promise. However, these methods can struggle to sample from multi-modal or heavy-tailed distributions. We show how tempering ideas can improve the mixing of PDMPs in such cases. We introduce an extended distribution defined over the state of the posterior distribution and an inverse temperature, which interpolates between a tractable distribution when the inverse temperature is 0 and the posterior when the inverse temperature is 1. The marginal distribution of the inverse temperature is a mixture of a continuous distribution on [0,1) and a point mass at 1: which means that we obtain samples when the inverse temperature is 1, and these are draws from the posterior, but sampling algorithms will also explore distributions at lower temperatures which will improve mixing. We show how PDMPs, and particularly the Zig-Zag sampler, can be implemented to sample from such an extended distribution. The resulting algorithm is easy to implement and we show empirically that it can outperform existing PDMP-based samplers on challenging multimodal posteriors.

M3 - Journal article

VL - 35

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

SP - 28293

EP - 28304

ER -
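The abstract's central tool is the Zig-Zag sampler, a PDMP whose state drifts with constant velocity and flips direction at events of an inhomogeneous Poisson process. The sketch below is a minimal one-dimensional Zig-Zag sampler for a standard normal target, not the paper's tempered algorithm: for this target the event-rate integral inverts in closed form, so no thinning is needed. All names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)


def zigzag_gaussian(n_events=5000, x0=0.0, v0=1.0):
    """Minimal 1-D Zig-Zag sampler for a standard normal target.

    The state moves with velocity v in {-1, +1} and flips v at events of
    a Poisson process with rate lambda(x, v) = max(0, v * U'(x)), where
    U(x) = x**2 / 2. Inverting the integrated rate against an Exp(1) draw
    gives the next event time exactly.
    """
    x, v = x0, v0
    t = 0.0
    times, positions = [0.0], [x0]
    for _ in range(n_events):
        w = v * x                       # signed gradient along direction v
        e = rng.exponential()           # Exp(1) draw for time inversion
        tau = -w + np.sqrt(max(w, 0.0) ** 2 + 2.0 * e)
        x = x + v * tau                 # deterministic drift to the event
        t += tau
        v = -v                          # flip the velocity at the event
        times.append(t)
        positions.append(x)
    return np.array(times), np.array(positions)


def path_mean(times, positions):
    """Time average of x along the piecewise-linear trajectory."""
    dt = np.diff(times)
    mids = 0.5 * (positions[:-1] + positions[1:])  # x is linear between events
    return np.sum(mids * dt) / np.sum(dt)


times, xs = zigzag_gaussian()
print(path_mean(times, xs))  # close to the target mean 0
```

The paper's contribution is to run such a process on the extended (state, inverse-temperature) space described in the abstract, so the trajectory also moves through flatter, easier-to-mix tempered distributions; this sketch shows only the untempered building block.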