Bernoulli-logistic Latent Gaussian Models (bLGMs) subsume many popular models for binary data, such as Bayesian logistic regression, Gaussian process classification, probabilistic principal components analysis, and factor analysis. Fitting these models is difficult due to an intractable logistic-Gaussian integral in the marginal likelihood. Even the standard variational framework, which applies Jensen's inequality, does not make the integral tractable.
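For concreteness, the problematic integral takes the following standard form (the notation here is assumed, not taken from the abstract: y ∈ {0,1} is a binary observation, z the latent Gaussian variable, and σ(·) the logistic function):

```latex
p(y) = \int \sigma(z)^{y}\,\bigl(1-\sigma(z)\bigr)^{1-y}\,
       \mathcal{N}(z \mid \mu, v)\, dz
     = \int \exp\bigl(yz - \log(1+e^{z})\bigr)\,
       \mathcal{N}(z \mid \mu, v)\, dz
```

The term log(1 + e^z) in the exponent is the logistic-log-partition (LLP) function; because it has no closed-form Gaussian expectation, the integral is intractable, which is what the bounds below target.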
In this work, we propose the use of fixed piecewise linear and quadratic upper bounds to the logistic-log-partition (LLP) function as a way of circumventing this intractable integral. We describe a framework for approximately computing minimax optimal piecewise quadratic bounds, as well as a generalized expectation maximization algorithm that uses piecewise bounds to estimate bLGMs. We prove a theoretical result relating the maximum error in the LLP bound to the maximum error in the marginal likelihood estimate.
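The key property exploited here is that a quadratic can upper-bound the LLP function piece by piece. The sketch below is a toy illustration only, not the minimax-optimal construction from the work: it builds a three-piece quadratic bound by Taylor-expanding the LLP at an arbitrary anchor per piece and using second-order coefficient 1/8, which is valid because the LLP's second derivative is σ(x)(1 − σ(x)) ≤ 1/4 everywhere. The knot and anchor locations are made-up values for demonstration.

```python
import numpy as np

def llp(x):
    # Logistic-log-partition function log(1 + exp(x)), computed stably.
    return np.logaddexp(0.0, x)

def piecewise_quadratic_bound(x, knots=(-np.inf, -2.0, 2.0, np.inf),
                              anchors=(-4.0, 0.0, 4.0)):
    """Toy 3-piece quadratic upper bound on the LLP function.

    Each piece is the second-order Taylor expansion of llp at an anchor
    point with curvature coefficient 1/8.  Since llp''(x) <= 1/4 for all
    x, the Lagrange remainder argument makes each such quadratic a valid
    upper bound, hence the piecewise construction is too.  Knots and
    anchors here are arbitrary, NOT the minimax-optimal choices.
    """
    xf = np.atleast_1d(np.asarray(x, dtype=float))
    out = np.empty_like(xf)
    for lo, hi, a in zip(knots[:-1], knots[1:], anchors):
        mask = (xf >= lo) & (xf < hi)
        s = 1.0 / (1.0 + np.exp(-a))  # sigmoid(a) = llp'(a)
        out[mask] = llp(a) + s * (xf[mask] - a) + 0.125 * (xf[mask] - a) ** 2
    return out
```

Evaluating the bound on a grid and checking it never dips below the LLP confirms validity; the bound is exact at each anchor, which mirrors how tighter piecewise constructions trade the number of pieces against maximum gap.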
Through application to real-world data, we show that the proposed bounds achieve better estimation accuracy than existing variational bounds with only a small increase in computation. We also show that, unlike existing sampling methods, our methods offer guaranteed convergence, easy convergence diagnostics, and scale well to datasets containing thousands of variables and instances. Finally, we illustrate the application of our bounds to modeling categorical and ordinal data with latent Gaussian models.
This is joint work with Kevin Murphy and Benjamin Marlin.