Fast Variational Inference in the Conjugate Exponential Family

  • James Hensman | University of Sheffield

We present a general method for deriving collapsed variational inference algorithms for probabilistic models in the conjugate exponential family. Our method unifies many existing approaches to collapsed variational inference and leads to a new lower bound on the marginal likelihood. We exploit the information geometry of this bound to derive much faster optimization methods based on conjugate gradients for these models. Our approach is very general and is easily applied to any model for which the mean field update equations have been derived. Empirically we show significant speed-ups for probabilistic inference using our bound.
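As a rough sketch of the idea, in generic notation that may differ from the talk's (Y for data, X for latent variables, θ for the conjugate-exponential parameters, q(X) for the variational distribution): collapsing θ out of the standard mean-field bound, by substituting its optimal exponential-family form, gives a bound on the marginal likelihood of the form

\[
\mathcal{L}_{\mathrm{KL}}\big(q(X)\big) \;=\; \log \int p(\theta)\,\exp\!\Big(\mathbb{E}_{q(X)}\big[\log p(Y, X \mid \theta)\big]\Big)\, \mathrm{d}\theta \;+\; \mathrm{H}\big[q(X)\big],
\]

where H denotes entropy. The information geometry referred to in the abstract concerns this bound viewed in the natural parameters of q(X): a unit-length step along the natural gradient reproduces the usual mean-field update, which is what allows the bound to be optimized instead with faster conjugate-gradient steps.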

Speaker Details

James Hensman completed his PhD in the Dynamics Research Group in the engineering department at the University of Sheffield in 2009. After an EPSRC prize fellowship applying machine learning methods to the monitoring of aircraft structural components, he moved to Sheffield's machine learning group. His current research interests lie in large scale probabilistic models with applications in engineering and bioinformatics.
