
Fast variational inference in the conjugate exponential family

Research output: Conference contribution/paper (peer-reviewed), in book/report/proceedings with ISBN/ISSN

Published
  • James Hensman
  • Magnus Rattray
  • Neil D. Lawrence
Publication date: 2012
Host publication: Advances in Neural Information Processing Systems
Pages: 2888-2896
Number of pages: 9
Volume: 4
Original language: English
Event: 26th Annual Conference on Neural Information Processing Systems 2012, NIPS 2012 - Lake Tahoe, NV, United States
Duration: 3/12/2012 – 6/12/2012

Conference

Conference: 26th Annual Conference on Neural Information Processing Systems 2012, NIPS 2012
Country/Territory: United States
City: Lake Tahoe, NV
Period: 3/12/12 – 6/12/12

Abstract

We present a general method for deriving collapsed variational inference algorithms for probabilistic models in the conjugate exponential family. Our method unifies many existing approaches to collapsed variational inference and leads to a new lower bound on the marginal likelihood. We exploit the information geometry of the bound to derive much faster optimization methods based on conjugate gradients for these models. Our approach is very general and is easily applied to any model for which the mean field update equations have been derived. Empirically, we show significant speed-ups for probabilistic inference using our bound.
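The key property the abstract relies on can be illustrated on a toy problem. The sketch below (not taken from the paper; the Gaussian-mean model, step size, and all variable names are illustrative) shows why conjugate-exponential structure makes gradient-based variational inference fast: for a conjugate model the natural gradient of the bound with respect to the natural parameters of q is simply the gap to the exact posterior's natural parameters, so a unit-length natural-gradient step lands on the optimum in one iteration.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=50)   # data with unknown mean, known unit variance
N = len(x)

# Illustrative conjugate model (not the paper's experiments):
#   mu ~ N(0, 1),   x_i | mu ~ N(mu, 1)
# Variational posterior q(mu) = N(m, s), held in natural-parameter
# form eta = (m/s, -1/(2s)); initialise q at the prior N(0, 1).
eta1, eta2 = 0.0, -0.5

# Conjugacy gives the exact posterior's natural parameters in closed form.
eta1_star = x.sum()                  # posterior mean * posterior precision
eta2_star = -0.5 * (1.0 + N)         # -0.5 * posterior precision

# Natural-gradient ascent on the ELBO: the natural gradient equals
# (eta_star - eta), so step size rho = 1 reaches the fixed point at once.
rho = 1.0
eta1 += rho * (eta1_star - eta1)
eta2 += rho * (eta2_star - eta2)

# Map back to mean/variance to read off the result.
s = -0.5 / eta2
m = eta1 * s
print(m, s)                          # equals x.sum()/(N+1) and 1/(N+1)
```

With a smaller step size the same update interpolates smoothly toward the posterior, which is the setting where the conjugate-gradient acceleration studied in the paper becomes relevant.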