
Pseudo-extended Markov chain Monte Carlo

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN > Conference contribution/Paper

Forthcoming
Publication date: 4/09/2019
Host publication: Thirty-third Conference on Neural Information Processing Systems
Publisher: Neural Information Processing Systems Foundation
Volume: 33
Original language: English

Publication series

Name: Neural Information Processing Systems

Abstract

Sampling from the posterior distribution using Markov chain Monte Carlo (MCMC) methods can require a prohibitively large number of iterations to fully explore the correct posterior. This is often the case when the posterior of interest is multi-modal, as the MCMC sampler can become trapped in a local mode for a large number of iterations. In this paper, we introduce the pseudo-extended MCMC method as an approach for improving the mixing of the MCMC sampler in complex posterior distributions. The pseudo-extended method augments the state-space of the posterior using pseudo-samples as auxiliary variables; on the extended space, the MCMC sampler is able to easily move between the well-separated modes of the posterior. We apply the pseudo-extended method within a Hamiltonian Monte Carlo sampler and show that by using the No-U-Turn algorithm (Hoffman and Gelman, 2014), our proposed sampler is completely tuning-free. We compare the pseudo-extended method against well-known tempered MCMC algorithms and show the advantages of the new sampler on a number of challenging examples from the statistics literature.
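The augmentation described in the abstract can be illustrated with a minimal sketch. Here the target is a hypothetical toy bimodal density (a mixture of two well-separated Gaussians), the instrumental density `q` is an assumed broad Gaussian, and a hand-rolled random-walk Metropolis kernel stands in for the HMC/NUTS sampler used in the paper; the extended-target form follows the pseudo-sample construction the abstract describes, but all concrete names, densities, and tuning values below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def log_target(x):
    # Toy bimodal target: equal mixture of N(-4, 1) and N(4, 1)
    # (hypothetical example, not from the paper).
    return np.logaddexp(-0.5 * (x + 4.0) ** 2, -0.5 * (x - 4.0) ** 2)

def log_instrumental(x):
    # Broad instrumental density q = N(0, 5^2) covering both modes (assumed).
    return -0.5 * (x / 5.0) ** 2 - np.log(5.0 * np.sqrt(2.0 * np.pi))

def log_pseudo_extended(xs):
    # Pseudo-extended target on N pseudo-samples:
    #   pi~(x_1..x_N) ∝ [prod_j q(x_j)] * (1/N) * sum_i pi(x_i) / q(x_i)
    # Computed in log space for numerical stability.
    lq = log_instrumental(xs)
    lp = log_target(xs)
    return lq.sum() + np.logaddexp.reduce(lp - lq) - np.log(len(xs))

def rwm_pseudo_extended(n_iters=5000, n_pseudo=2, step=1.0, seed=0):
    # Random-walk Metropolis on the extended space; the paper instead
    # uses Hamiltonian Monte Carlo with the No-U-Turn algorithm.
    rng = np.random.default_rng(seed)
    xs = rng.normal(0.0, 5.0, size=n_pseudo)  # initialise from q
    logp = log_pseudo_extended(xs)
    chain = []
    for _ in range(n_iters):
        prop = xs + step * rng.normal(size=n_pseudo)
        logp_prop = log_pseudo_extended(prop)
        if np.log(rng.uniform()) < logp_prop - logp:
            xs, logp = prop, logp_prop
        chain.append(xs.copy())
    return np.asarray(chain)
```

Because the weight term `sum_i pi(x_i)/q(x_i)` only needs one pseudo-sample near a mode, the remaining pseudo-samples can roam under the broad `q` and cross the low-probability region between modes, which is the mixing mechanism the abstract refers to. Samples of the original target would then be recovered by selecting a pseudo-sample index with probability proportional to its weight, a step omitted here for brevity.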