
Tilted variational Bayes

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Tilted variational Bayes. / Hensman, James; Zwießele, Max; Lawrence, Neil D.
In: Proceedings of Machine Learning Research, Vol. 33, 2014, p. 356-364.


Harvard

Hensman, J, Zwießele, M & Lawrence, ND 2014, 'Tilted variational Bayes', Proceedings of Machine Learning Research, vol. 33, pp. 356-364. <http://jmlr.org/proceedings/papers/v33/hensman14.html>

APA

Hensman, J., Zwießele, M., & Lawrence, N. D. (2014). Tilted variational Bayes. Proceedings of Machine Learning Research, 33, 356-364. http://jmlr.org/proceedings/papers/v33/hensman14.html

Vancouver

Hensman J, Zwießele M, Lawrence ND. Tilted variational Bayes. Proceedings of Machine Learning Research. 2014;33:356-364.

Author

Hensman, James ; Zwießele, Max ; Lawrence, Neil D. / Tilted variational Bayes. In: Proceedings of Machine Learning Research. 2014 ; Vol. 33. pp. 356-364.

BibTeX

@article{3bda185f26a64d39b7e6812b5dd8426f,
title = "Tilted variational {B}ayes",
abstract = "We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at github.com/SheffieldML/TVB.",
author = "James Hensman and Max Zwie{\ss}ele and Lawrence, {Neil D.}",
year = "2014",
language = "English",
volume = "33",
pages = "356--364",
journal = "Proceedings of Machine Learning Research",
issn = "1938-7228",
url = "http://jmlr.org/proceedings/papers/v33/hensman14.html",
}

RIS

TY - JOUR

T1 - Tilted variational Bayes

AU - Hensman, James

AU - Zwießele, Max

AU - Lawrence, Neil D.

PY - 2014

Y1 - 2014

N2 - We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at github.com/SheffieldML/TVB.

AB - We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood. We apply the method to Gaussian process classification, a situation where the Kullback-Leibler divergence minimized in traditional VB can be infinite, and to robust Gaussian process regression, where the inference process is dramatically simplified in comparison to EP. Code to reproduce all the experiments can be found at github.com/SheffieldML/TVB.

M3 - Journal article

AN - SCOPUS:84955490168

VL - 33

SP - 356

EP - 364

JO - Proceedings of Machine Learning Research

JF - Proceedings of Machine Learning Research

SN - 1938-7228

UR - http://jmlr.org/proceedings/papers/v33/hensman14.html

ER -
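
As background for the abstract above, a minimal sketch in Python (NumPy/SciPy) of the kind of bound the paper targets. This is not the paper's TVB bound, and the function name elbo_mc and its arguments are illustrative; the authors' own implementation is at github.com/SheffieldML/TVB. The sketch Monte-Carlo-estimates the standard variational Bayes lower bound log p(y) >= E_q[log p(y|f)] - KL(q(f) || p(f)) for Gaussian process classification with a Gaussian approximation q(f) = N(m, S); per the abstract, TVB derives a bound of this type using EP-style tilted distributions, so that light-tailed likelihoods (where traditional VB fails) remain usable.

# Illustrative sketch, not the paper's TVB implementation.
import numpy as np
from scipy.stats import norm

def elbo_mc(y, m, S, K, n_samples=2000, seed=0):
    """Monte Carlo estimate of the standard VB lower bound (ELBO).

    y : (n,) labels in {-1, +1}
    m : (n,) mean of the Gaussian approximation q(f)
    S : (n, n) covariance of q(f)
    K : (n, n) GP prior covariance (zero prior mean assumed)
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    L = np.linalg.cholesky(S + 1e-10 * np.eye(n))
    f = m + rng.standard_normal((n_samples, n)) @ L.T  # draws from q(f)
    # E_q[log p(y | f)] with a probit likelihood p(y_n | f_n) = Phi(y_n f_n)
    exp_loglik = norm.logcdf(y * f).sum(axis=1).mean()
    # KL( N(m, S) || N(0, K) ) in closed form
    kl = 0.5 * (np.trace(np.linalg.solve(K, S))
                + m @ np.linalg.solve(K, m)
                - n
                + np.linalg.slogdet(K)[1]
                - np.linalg.slogdet(S)[1])
    return exp_loglik - kl

A quick sanity check: with m set to zero and S = K, the KL term vanishes and the estimate reduces to the expected log-likelihood under the GP prior.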