
Electronic data

  • accepted_version

    Accepted author manuscript, 4.57 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License


Stochastic gradient Markov chain Monte Carlo

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Stochastic gradient Markov chain Monte Carlo. / Nemeth, Christopher; Fearnhead, Paul.
In: Journal of the American Statistical Association, Vol. 116, No. 533, 30.03.2021, p. 433-450.


Harvard

Nemeth, C & Fearnhead, P 2021, 'Stochastic gradient Markov chain Monte Carlo', Journal of the American Statistical Association, vol. 116, no. 533, pp. 433-450. https://doi.org/10.1080/01621459.2020.1847120

APA

Nemeth, C., & Fearnhead, P. (2021). Stochastic gradient Markov chain Monte Carlo. Journal of the American Statistical Association, 116(533), 433-450. https://doi.org/10.1080/01621459.2020.1847120

Vancouver

Nemeth C, Fearnhead P. Stochastic gradient Markov chain Monte Carlo. Journal of the American Statistical Association. 2021 Mar 30;116(533):433-450. Epub 2021 Jan 4. doi: 10.1080/01621459.2020.1847120

Author

Nemeth, Christopher ; Fearnhead, Paul. / Stochastic gradient Markov chain Monte Carlo. In: Journal of the American Statistical Association. 2021 ; Vol. 116, No. 533. pp. 433-450.

Bibtex

@article{293dc8b7e6404525865a059049e9212e,
title = "Stochastic gradient Markov chain Monte Carlo",
abstract = "Markov chain Monte Carlo (MCMC) algorithms are generally regarded as the gold standard technique for Bayesian inference. They are theoretically well-understood and conceptually simple to apply in practice. The drawback of MCMC is that performing exact inference generally requires all of the data to be processed at each iteration of the algorithm. For large data sets, the computational cost of MCMC can be prohibitive, which has led to recent developments in scalable Monte Carlo algorithms that have a significantly lower computational cost than standard MCMC. In this paper, we focus on a particular class of scalable Monte Carlo algorithms, stochastic gradient Markov chain Monte Carlo (SGMCMC) which utilises data subsampling techniques to reduce the per iteration cost of MCMC. We provide an introduction to some popular SGMCMC algorithms and review the supporting theoretical results, as well as comparing the efficiency of SGMCMC algorithms against MCMC on benchmark examples. The supporting R code is available online at https://github.com/chris-nemeth/sgmcmc-review-paper.",
author = "Christopher Nemeth and Paul Fearnhead",
note = "This is an Accepted Manuscript of an article published by Taylor & Francis in Journal of the American Statistical Association in March 2021, available online: https://www.tandfonline.com/doi/full/10.1080/01621459.2020.1847120",
year = "2021",
month = mar,
day = "30",
doi = "10.1080/01621459.2020.1847120",
language = "English",
volume = "116",
pages = "433--450",
journal = "Journal of the American Statistical Association",
issn = "0162-1459",
publisher = "Taylor and Francis Ltd.",
number = "533",

}
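The abstract above explains the core idea of SGMCMC: replacing the full-data gradient in a gradient-based MCMC update with an unbiased minibatch estimate, cutting the per-iteration cost from O(N) to O(n). The following is a minimal illustrative sketch of one such algorithm, stochastic gradient Langevin dynamics (SGLD), on a toy Gaussian model. It is not the authors' implementation (their supporting R code is at the GitHub link in the abstract); the model, step size, and batch size here are hypothetical choices for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: x_i ~ N(theta, 1) for i = 1..N, with prior theta ~ N(0, 10).
N = 10_000
true_theta = 2.0
x = rng.normal(true_theta, 1.0, size=N)

def grad_log_prior(theta):
    # d/dtheta of log N(theta | 0, 10)
    return -theta / 10.0

def grad_log_lik(theta, batch):
    # d/dtheta of the sum of log N(x_i | theta, 1) over the minibatch
    return np.sum(batch - theta)

def sgld(x, n_iters=5_000, batch_size=100, step=1e-4):
    """SGLD: Langevin update driven by a subsampled gradient estimate."""
    theta = 0.0
    samples = np.empty(n_iters)
    for t in range(n_iters):
        batch = rng.choice(x, size=batch_size, replace=False)
        # Rescale the minibatch gradient by N/n so the estimate of the
        # full-data log-likelihood gradient is unbiased.
        grad = grad_log_prior(theta) + (len(x) / batch_size) * grad_log_lik(theta, batch)
        # Half-step gradient move plus injected N(0, step) noise.
        theta += 0.5 * step * grad + rng.normal(0.0, np.sqrt(step))
        samples[t] = theta
    return samples

samples = sgld(x)
posterior_mean = samples[1_000:].mean()  # discard burn-in
```

With a fixed step size, SGLD samples from an approximation to the posterior; the subsampling noise inflates the variance relative to exact MCMC, which is one of the trade-offs the paper's theoretical review quantifies.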

RIS

TY - JOUR

T1 - Stochastic gradient Markov chain Monte Carlo

AU - Nemeth, Christopher

AU - Fearnhead, Paul

N1 - This is an Accepted Manuscript of an article published by Taylor & Francis in Journal of the American Statistical Association in March 2021, available online: https://www.tandfonline.com/doi/full/10.1080/01621459.2020.1847120

PY - 2021/3/30

Y1 - 2021/3/30

N2 - Markov chain Monte Carlo (MCMC) algorithms are generally regarded as the gold standard technique for Bayesian inference. They are theoretically well-understood and conceptually simple to apply in practice. The drawback of MCMC is that performing exact inference generally requires all of the data to be processed at each iteration of the algorithm. For large data sets, the computational cost of MCMC can be prohibitive, which has led to recent developments in scalable Monte Carlo algorithms that have a significantly lower computational cost than standard MCMC. In this paper, we focus on a particular class of scalable Monte Carlo algorithms, stochastic gradient Markov chain Monte Carlo (SGMCMC) which utilises data subsampling techniques to reduce the per iteration cost of MCMC. We provide an introduction to some popular SGMCMC algorithms and review the supporting theoretical results, as well as comparing the efficiency of SGMCMC algorithms against MCMC on benchmark examples. The supporting R code is available online at https://github.com/chris-nemeth/sgmcmc-review-paper.

AB - Markov chain Monte Carlo (MCMC) algorithms are generally regarded as the gold standard technique for Bayesian inference. They are theoretically well-understood and conceptually simple to apply in practice. The drawback of MCMC is that performing exact inference generally requires all of the data to be processed at each iteration of the algorithm. For large data sets, the computational cost of MCMC can be prohibitive, which has led to recent developments in scalable Monte Carlo algorithms that have a significantly lower computational cost than standard MCMC. In this paper, we focus on a particular class of scalable Monte Carlo algorithms, stochastic gradient Markov chain Monte Carlo (SGMCMC) which utilises data subsampling techniques to reduce the per iteration cost of MCMC. We provide an introduction to some popular SGMCMC algorithms and review the supporting theoretical results, as well as comparing the efficiency of SGMCMC algorithms against MCMC on benchmark examples. The supporting R code is available online at https://github.com/chris-nemeth/sgmcmc-review-paper.

U2 - 10.1080/01621459.2020.1847120

DO - 10.1080/01621459.2020.1847120

M3 - Journal article

VL - 116

SP - 433

EP - 450

JO - Journal of the American Statistical Association

JF - Journal of the American Statistical Association

SN - 0162-1459

IS - 533

ER -