
Transport Elliptical Slice Sampling

Research output: Contribution to Journal/Magazine › Conference article › peer-review

Published

Standard

Transport Elliptical Slice Sampling. / Cabezas, Alberto; Nemeth, Christopher.
In: Proceedings of Machine Learning Research, Vol. 206, 25.04.2023, p. 3664-3676.

Research output: Contribution to Journal/Magazine › Conference article › peer-review

Harvard

Cabezas, A & Nemeth, C 2023, 'Transport Elliptical Slice Sampling', Proceedings of Machine Learning Research, vol. 206, pp. 3664-3676. <https://proceedings.mlr.press/v206/cabezas23a/cabezas23a.pdf>

APA

Cabezas, A., & Nemeth, C. (2023). Transport Elliptical Slice Sampling. Proceedings of Machine Learning Research, 206, 3664-3676. https://proceedings.mlr.press/v206/cabezas23a/cabezas23a.pdf

Vancouver

Cabezas A, Nemeth C. Transport Elliptical Slice Sampling. Proceedings of Machine Learning Research. 2023 Apr 25;206:3664-3676.

Author

Cabezas, Alberto ; Nemeth, Christopher. / Transport Elliptical Slice Sampling. In: Proceedings of Machine Learning Research. 2023 ; Vol. 206. pp. 3664-3676.

Bibtex

@article{a60629535c794f328a49c9f6a4fab168,
title = "Transport Elliptical Slice Sampling",
abstract = "We propose a new framework for efficiently sampling from complex probability distributions using a combination of normalizing flows and elliptical slice sampling (Murray et al., 2010). The central idea is to learn a diffeomorphism, through normalizing flows, that maps the non-Gaussian structure of the target distribution to an approximately Gaussian distribution. We then use the elliptical slice sampler, an efficient and tuning-free Markov chain Monte Carlo (MCMC) algorithm, to sample from the transformed distribution. The samples are then pulled back using the inverse normalizing flow, yielding samples that approximate the stationary target distribution of interest. Our transport elliptical slice sampler (TESS) is optimized for modern computer architectures, where its adaptation mechanism utilizes parallel cores to rapidly run multiple Markov chains for a few iterations. Numerical demonstrations show that TESS produces Monte Carlo samples from the target distribution with lower autocorrelation compared to non-transformed samplers, and demonstrates significant improvements in efficiency when compared to gradient-based proposals designed for parallel computer architectures, given a flexible enough diffeomorphism.",
author = "Alberto Cabezas and Christopher Nemeth",
year = "2023",
month = apr,
day = "25",
language = "English",
volume = "206",
pages = "3664--3676",
journal = "Proceedings of Machine Learning Research",
issn = "2640-3498",
publisher = "ML Research Press",
note = "26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023 ; Conference date: 25-04-2023 Through 27-04-2023",

}

RIS

TY - JOUR

T1 - Transport Elliptical Slice Sampling

AU - Cabezas, Alberto

AU - Nemeth, Christopher

PY - 2023/4/25

Y1 - 2023/4/25

N2 - We propose a new framework for efficiently sampling from complex probability distributions using a combination of normalizing flows and elliptical slice sampling (Murray et al., 2010). The central idea is to learn a diffeomorphism, through normalizing flows, that maps the non-Gaussian structure of the target distribution to an approximately Gaussian distribution. We then use the elliptical slice sampler, an efficient and tuning-free Markov chain Monte Carlo (MCMC) algorithm, to sample from the transformed distribution. The samples are then pulled back using the inverse normalizing flow, yielding samples that approximate the stationary target distribution of interest. Our transport elliptical slice sampler (TESS) is optimized for modern computer architectures, where its adaptation mechanism utilizes parallel cores to rapidly run multiple Markov chains for a few iterations. Numerical demonstrations show that TESS produces Monte Carlo samples from the target distribution with lower autocorrelation compared to non-transformed samplers, and demonstrates significant improvements in efficiency when compared to gradient-based proposals designed for parallel computer architectures, given a flexible enough diffeomorphism.

AB - We propose a new framework for efficiently sampling from complex probability distributions using a combination of normalizing flows and elliptical slice sampling (Murray et al., 2010). The central idea is to learn a diffeomorphism, through normalizing flows, that maps the non-Gaussian structure of the target distribution to an approximately Gaussian distribution. We then use the elliptical slice sampler, an efficient and tuning-free Markov chain Monte Carlo (MCMC) algorithm, to sample from the transformed distribution. The samples are then pulled back using the inverse normalizing flow, yielding samples that approximate the stationary target distribution of interest. Our transport elliptical slice sampler (TESS) is optimized for modern computer architectures, where its adaptation mechanism utilizes parallel cores to rapidly run multiple Markov chains for a few iterations. Numerical demonstrations show that TESS produces Monte Carlo samples from the target distribution with lower autocorrelation compared to non-transformed samplers, and demonstrates significant improvements in efficiency when compared to gradient-based proposals designed for parallel computer architectures, given a flexible enough diffeomorphism.

M3 - Conference article

AN - SCOPUS:85165147454

VL - 206

SP - 3664

EP - 3676

JO - Proceedings of Machine Learning Research

JF - Proceedings of Machine Learning Research

SN - 2640-3498

T2 - 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023

Y2 - 25 April 2023 through 27 April 2023

ER -
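
Illustrative sketch

The abstract describes the TESS recipe: learn a diffeomorphism (a normalizing flow) that maps the target to an approximately standard Gaussian, run the tuning-free elliptical slice sampler (Murray et al., 2010) in that latent space, and pull the samples back through the map. The following Python sketch illustrates that recipe only; it is not the authors' implementation. The names transport and log_det_jac are hypothetical placeholders for a flow fitted elsewhere and its log-Jacobian determinant, and the parallel adaptation mechanism from the paper is omitted.

# Minimal sketch of elliptical slice sampling in a flow-transformed latent space.
# Assumes a pre-trained diffeomorphism `transport` (latent -> target space) with
# log-Jacobian determinant `log_det_jac`; both are hypothetical placeholders here.

import numpy as np

def latent_loglik(u, logpi, transport, log_det_jac):
    """Residual 'likelihood' in latent space: the part of the transformed target
    density not accounted for by the N(0, I) reference measure (up to a constant)."""
    x = transport(u)
    return logpi(x) + log_det_jac(u) + 0.5 * np.dot(u, u)

def ess_step(u, logpi, transport, log_det_jac, rng):
    """One elliptical slice sampling update (Murray et al., 2010) on the latent u."""
    d = u.shape[0]
    nu = rng.standard_normal(d)                        # auxiliary draw defining the ellipse
    log_y = latent_loglik(u, logpi, transport, log_det_jac) + np.log(rng.uniform())
    theta = rng.uniform(0.0, 2.0 * np.pi)              # initial angle on the ellipse
    theta_min, theta_max = theta - 2.0 * np.pi, theta
    while True:
        u_prop = u * np.cos(theta) + nu * np.sin(theta)
        if latent_loglik(u_prop, logpi, transport, log_det_jac) > log_y:
            return u_prop                              # point on the ellipse above the slice
        # shrink the angle bracket towards the current state and retry
        if theta < 0.0:
            theta_min = theta
        else:
            theta_max = theta
        theta = rng.uniform(theta_min, theta_max)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    logpi = lambda x: -0.5 * np.sum((x - 2.0) ** 2)    # toy target: N(2, I)
    transport = lambda u: u                            # identity placeholder flow
    log_det_jac = lambda u: 0.0
    u = np.zeros(2)
    draws = []
    for _ in range(2000):
        u = ess_step(u, logpi, transport, log_det_jac, rng)
        draws.append(transport(u))                     # pull the sample back to target space
    print(np.mean(draws, axis=0))                      # approximately [2, 2]

With the identity placeholder the latent space coincides with the target space, so the loop reduces to plain elliptical slice sampling on a Gaussian target; with a flexible trained flow, ess_step updates the latent variable and transport(u) returns the corresponding sample in the original space, as in the abstract's description.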