
Optimal scaling for partially updating MCMC algorithms

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Optimal scaling for partially updating MCMC algorithms. / Neal, Peter John; Roberts, Gareth.
In: Annals of Applied Probability, Vol. 16, No. 2, 05.2006, p. 475-515.


Harvard

Neal, PJ & Roberts, G 2006, 'Optimal scaling for partially updating MCMC algorithms', Annals of Applied Probability, vol. 16, no. 2, pp. 475-515. <http://projecteuclid.org/download/pdfview_1/euclid.aoap/1151592241>

APA

Neal, P. J., & Roberts, G. (2006). Optimal scaling for partially updating MCMC algorithms. Annals of Applied Probability, 16(2), 475-515.

Vancouver

Neal PJ, Roberts G. Optimal scaling for partially updating MCMC algorithms. Annals of Applied Probability. 2006 May;16(2):475-515.

Author

Neal, Peter John ; Roberts, Gareth. / Optimal scaling for partially updating MCMC algorithms. In: Annals of Applied Probability. 2006 ; Vol. 16, No. 2. pp. 475-515.

Bibtex

@article{91c13a47265c4dcab8672bd852f8b058,
title = "Optimal scaling for partially updating MCMC algorithms",
abstract = "In this paper we shall consider optimal scaling problems for high-dimensional Metropolis–Hastings algorithms where updates can be chosen to be lower dimensional than the target density itself. We find that the optimal scaling rule for the Metropolis algorithm, which tunes the overall algorithm acceptance rate to be 0.234, holds for the so-called Metropolis-within-Gibbs algorithm as well. Furthermore, the optimal efficiency obtainable is independent of the dimensionality of the update rule. This has important implications for the MCMC practitioner since high-dimensional updates are generally computationally more demanding, so that lower-dimensional updates are therefore to be preferred. Similar results with rather different conclusions are given for so-called Langevin updates. In this case, it is found that high-dimensional updates are frequently most efficient, even taking into account computing costs.",
author = "Neal, {Peter John} and Gareth Roberts",
year = "2006",
month = may,
language = "English",
volume = "16",
pages = "475--515",
journal = "Annals of Applied Probability",
issn = "1050-5164",
publisher = "Institute of Mathematical Statistics",
number = "2",
}

RIS

TY - JOUR

T1 - Optimal scaling for partially updating MCMC algorithms

AU - Neal, Peter John

AU - Roberts, Gareth

PY - 2006/5

Y1 - 2006/5

N2 - In this paper we shall consider optimal scaling problems for high-dimensional Metropolis–Hastings algorithms where updates can be chosen to be lower dimensional than the target density itself. We find that the optimal scaling rule for the Metropolis algorithm, which tunes the overall algorithm acceptance rate to be 0.234, holds for the so-called Metropolis-within-Gibbs algorithm as well. Furthermore, the optimal efficiency obtainable is independent of the dimensionality of the update rule. This has important implications for the MCMC practitioner since high-dimensional updates are generally computationally more demanding, so that lower-dimensional updates are therefore to be preferred. Similar results with rather different conclusions are given for so-called Langevin updates. In this case, it is found that high-dimensional updates are frequently most efficient, even taking into account computing costs.

AB - In this paper we shall consider optimal scaling problems for high-dimensional Metropolis–Hastings algorithms where updates can be chosen to be lower dimensional than the target density itself. We find that the optimal scaling rule for the Metropolis algorithm, which tunes the overall algorithm acceptance rate to be 0.234, holds for the so-called Metropolis-within-Gibbs algorithm as well. Furthermore, the optimal efficiency obtainable is independent of the dimensionality of the update rule. This has important implications for the MCMC practitioner since high-dimensional updates are generally computationally more demanding, so that lower-dimensional updates are therefore to be preferred. Similar results with rather different conclusions are given for so-called Langevin updates. In this case, it is found that high-dimensional updates are frequently most efficient, even taking into account computing costs.

M3 - Journal article

VL - 16

SP - 475

EP - 515

JO - Annals of Applied Probability

JF - Annals of Applied Probability

SN - 1050-5164

IS - 2

ER -
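The abstract's central claim, that the 0.234 optimal acceptance rate for random-walk Metropolis carries over to Metropolis-within-Gibbs with lower-dimensional updates, can be illustrated with a minimal sketch. This is not the authors' code: the Gaussian product target, the function name `metropolis_within_gibbs`, and the parameter choices (`dim`, `frac`, `ell`) are illustrative assumptions, with the per-coordinate proposal scale `ell / sqrt(dim)` matching the scaling regime the paper analyzes.

```python
import math
import random

def metropolis_within_gibbs(dim=50, frac=0.5, ell=2.4, iters=5000, seed=1):
    """Random-walk Metropolis-within-Gibbs on a standard Gaussian product target.

    Each iteration proposes a joint move on a random subset of
    round(frac * dim) coordinates, with per-coordinate proposal standard
    deviation ell / sqrt(dim).  Returns the empirical acceptance rate,
    which the optimal-scaling result says should be tuned toward 0.234.
    """
    rng = random.Random(seed)
    sd = ell / math.sqrt(dim)
    x = [0.0] * dim
    k = max(1, round(frac * dim))  # number of coordinates updated per step
    accepted = 0
    for _ in range(iters):
        idx = rng.sample(range(dim), k)
        step = {i: rng.gauss(0.0, sd) for i in idx}
        # The log target is -sum(x_i^2)/2, so the log acceptance ratio
        # depends only on the updated coordinates.
        log_alpha = sum(0.5 * (x[i] ** 2 - (x[i] + step[i]) ** 2) for i in idx)
        if rng.random() < math.exp(min(0.0, log_alpha)):
            for i in idx:
                x[i] += step[i]
            accepted += 1
    return accepted / iters
```

In practice one would vary `ell` (or `frac`) until the returned acceptance rate sits near 0.234; the paper's result is that the efficiency achievable at that rate does not depend on the dimensionality of the update.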