
MCMC for variationally sparse Gaussian processes

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

MCMC for variationally sparse Gaussian processes. / Hensman, James; De Matthews, Alexander G.; Filippone, Maurizio et al.
Advances in Neural Information Processing Systems. Vol. 2015. Neural information processing systems foundation, 2015. p. 1648-1656.


Harvard

Hensman, J, De Matthews, AG, Filippone, M & Ghahramani, Z 2015, MCMC for variationally sparse Gaussian processes. in Advances in Neural Information Processing Systems. vol. 2015, Neural information processing systems foundation, pp. 1648-1656, 29th Annual Conference on Neural Information Processing Systems, NIPS 2015, Montreal, Canada, 7/12/15.

APA

Hensman, J., De Matthews, A. G., Filippone, M., & Ghahramani, Z. (2015). MCMC for variationally sparse Gaussian processes. In Advances in Neural Information Processing Systems (Vol. 2015, pp. 1648-1656). Neural information processing systems foundation.

Vancouver

Hensman J, De Matthews AG, Filippone M, Ghahramani Z. MCMC for variationally sparse Gaussian processes. In Advances in Neural Information Processing Systems. Vol. 2015. Neural information processing systems foundation. 2015. p. 1648-1656.

Author

Hensman, James ; De Matthews, Alexander G. ; Filippone, Maurizio et al. / MCMC for variationally sparse Gaussian processes. Advances in Neural Information Processing Systems. Vol. 2015. Neural information processing systems foundation, 2015. pp. 1648-1656

Bibtex

@inproceedings{75b290859a0d41aeae589d15fc4810f3,
title = "MCMC for variationally sparse Gaussian processes",
abstract = "Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable research effort has been made into attacking three issues with GP models: how to compute efficiently when the number of data is large; how to approximate the posterior when the likelihood is not Gaussian; and how to estimate covariance function parameter posteriors. This paper simultaneously addresses these, using a variational approximation to the posterior which is sparse in support of the function but otherwise free-form. The result is a Hybrid Monte-Carlo sampling scheme which allows for a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs. Code to replicate each experiment in this paper is available at github.com/sparseMCMC.",
author = "James Hensman and {De Matthews}, {Alexander G.} and Maurizio Filippone and Zoubin Ghahramani",
year = "2015",
language = "English",
volume = "2015",
pages = "1648--1656",
booktitle = "Advances in Neural Information Processing Systems",
publisher = "Neural information processing systems foundation",
note = "29th Annual Conference on Neural Information Processing Systems, NIPS 2015; Conference date: 07-12-2015 through 12-12-2015",
}

RIS

TY - GEN

T1 - MCMC for variationally sparse Gaussian processes

AU - Hensman, James

AU - De Matthews, Alexander G.

AU - Filippone, Maurizio

AU - Ghahramani, Zoubin

PY - 2015

Y1 - 2015

N2 - Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable research effort has been made into attacking three issues with GP models: how to compute efficiently when the number of data is large; how to approximate the posterior when the likelihood is not Gaussian; and how to estimate covariance function parameter posteriors. This paper simultaneously addresses these, using a variational approximation to the posterior which is sparse in support of the function but otherwise free-form. The result is a Hybrid Monte-Carlo sampling scheme which allows for a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs. Code to replicate each experiment in this paper is available at github.com/sparseMCMC.

AB - Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable research effort has been made into attacking three issues with GP models: how to compute efficiently when the number of data is large; how to approximate the posterior when the likelihood is not Gaussian; and how to estimate covariance function parameter posteriors. This paper simultaneously addresses these, using a variational approximation to the posterior which is sparse in support of the function but otherwise free-form. The result is a Hybrid Monte-Carlo sampling scheme which allows for a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs. Code to replicate each experiment in this paper is available at github.com/sparseMCMC.

M3 - Conference contribution/Paper

AN - SCOPUS:84965135994

VL - 2015

SP - 1648

EP - 1656

BT - Advances in Neural Information Processing Systems

PB - Neural information processing systems foundation

T2 - 29th Annual Conference on Neural Information Processing Systems, NIPS 2015

Y2 - 7 December 2015 through 12 December 2015

ER -
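
Method sketch

The abstract above describes the method: for fixed hyperparameters, the optimal free-form variational posterior over the inducing values u satisfies log q*(u) = log p(u) + E_{p(f|u)}[log p(y|f)] + const, and the paper samples it jointly with the covariance hyperparameters by Hybrid (Hamiltonian) Monte Carlo. Below is a minimal, illustrative NumPy sketch of that idea, not the authors' implementation (which is at github.com/sparseMCMC). It assumes a Bernoulli classification likelihood, a fixed RBF kernel (the paper additionally samples the hyperparameters, which this sketch holds fixed), whitened inducing values u = Lv with L the Cholesky factor of Kmm, and Gauss-Hermite quadrature for the expected log-likelihood; the toy data, step size, and leapfrog count are arbitrary choices.

import numpy as np
from numpy.polynomial.hermite_e import hermegauss  # probabilists' Gauss-Hermite nodes/weights

rng = np.random.default_rng(0)

def rbf(X1, X2, variance=1.0, lengthscale=1.0):
    """Squared-exponential kernel with fixed (not sampled) hyperparameters."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

# Toy 1-D binary classification data and M = 10 inducing inputs Z (all illustrative).
X = np.linspace(-3, 3, 60)[:, None]
y = np.where(np.sin(2 * X[:, 0]) > 0, 1.0, -1.0)
Z = np.linspace(-3, 3, 10)[:, None]

Kmm = rbf(Z, Z) + 1e-6 * np.eye(len(Z))        # jitter for numerical stability
Knm = rbf(X, Z)
L = np.linalg.cholesky(Kmm)
A = np.linalg.solve(L, Knm.T).T                # with u = L v: E[f_n | v] = (A v)_n
s2 = np.maximum(1.0 - (A ** 2).sum(1), 1e-10)  # Var[f_n | v]; kernel variance is 1

gh_x, gh_w = hermegauss(20)
gh_w = gh_w / gh_w.sum()                       # weights for expectations under N(0, 1)

def log_target_and_grad(v):
    """Free-form log q*(v) = log N(v; 0, I) + sum_n E[log p(y_n | f_n)], and its gradient."""
    mu = A @ v
    f = mu[:, None] + np.sqrt(s2)[:, None] * gh_x[None, :]     # quadrature nodes per datum
    yf = y[:, None] * f
    loglik = -np.logaddexp(0.0, -yf)                           # log sigmoid(y f), stable
    dloglik = y[:, None] * np.exp(-np.logaddexp(0.0, yf))      # d/df log sigmoid(y f)
    de_dmu = (gh_w * dloglik).sum(1)
    logp = -0.5 * v @ v + (gh_w * loglik).sum()
    grad = -v + A.T @ de_dmu
    return logp, grad

def hmc(v, n_samples=1000, eps=0.05, n_leap=20):
    """Plain Hamiltonian Monte Carlo with a leapfrog integrator."""
    samples = []
    logp, grad = log_target_and_grad(v)
    for _ in range(n_samples):
        p0 = rng.standard_normal(v.size)
        v_new, p, g = v.copy(), p0.copy(), grad.copy()
        for _ in range(n_leap):
            p = p + 0.5 * eps * g
            v_new = v_new + eps * p
            lp_new, g = log_target_and_grad(v_new)
            p = p + 0.5 * eps * g
        # Metropolis accept/reject on the change in total energy.
        if np.log(rng.uniform()) < (lp_new - 0.5 * p @ p) - (logp - 0.5 * p0 @ p0):
            v, logp, grad = v_new, lp_new, g
        samples.append(v.copy())
    return np.array(samples)

V = hmc(rng.standard_normal(len(Z)))
U = V @ L.T                                    # map whitened samples back to u = L v
print("posterior mean of inducing values u:", U.mean(0).round(2))

Whitening (sampling v with u = Lv) is a standard reparameterisation that decorrelates the prior and typically helps HMC mix; the gradient of the quadrature term is exact here because the conditional variances s2 do not depend on v.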