Structural Expectation Propagation (SEP): Bayesian Structure Learning For Networks With Latent Variables
- Nevena Lazic
- Christopher Bishop
- John Winn
Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics (AISTATS)
Published by AISTATS
Learning the structure of discrete Bayesian networks has been the subject of extensive research in machine learning, with most Bayesian approaches focusing on fully observed networks. One of the few methods that can handle networks with latent variables is the "structural EM" algorithm, which interleaves greedy structure search with the estimation of latent variables and parameters, maintaining a single best network at each step. We introduce Structural Expectation Propagation (SEP), an extension of EP which can infer the structure of Bayesian networks with latent variables and missing data. SEP performs variational inference in a joint model of structure, latent variables, and parameters, offering two advantages: (i) it accounts for uncertainty in structure and parameter values when making local distribution updates, and (ii) it returns a variational distribution over network structures rather than a single network. We demonstrate the performance of SEP both on synthetic problems and on real-world clinical data.
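To make the contrast with structural EM concrete, the sketch below illustrates the general idea of maintaining a variational distribution over candidate structures jointly with posteriors over latent variables, rather than committing to a single best graph at each step. It is a simplified mean-field coordinate-ascent toy, not the paper's EP updates: parameters are fixed (SEP also treats them as uncertain), only two candidate graphs are considered, and all CPT values and variable names are assumptions made purely for illustration.

```python
import numpy as np

# Toy illustration (not the paper's SEP algorithm): mean-field inference of a
# joint variational posterior q(G) q(z_1..z_N) over two candidate structures,
#   G0: empty graph (X1, X2 independent of latent Z)
#   G1: Z -> X1, Z -> X2
# with a binary latent Z and observed binaries X1, X2.  CPTs are fixed.

rng = np.random.default_rng(0)
N = 200

# Assumed CPTs, for illustration only.
p_z = np.array([0.3, 0.7])          # p(Z = 0), p(Z = 1)
p_x1_z = np.array([0.9, 0.2])       # p(X1 = 1 | Z = z) for z = 0, 1
p_x2_z = np.array([0.8, 0.1])       # p(X2 = 1 | Z = z)
p_x1 = p_z @ p_x1_z                 # marginals used by the empty graph G0
p_x2 = p_z @ p_x2_z

# Sample data from G1, so the latent structure is actually present.
z_true = rng.random(N) < p_z[1]
x1 = (rng.random(N) < np.where(z_true, p_x1_z[1], p_x1_z[0])).astype(int)
x2 = (rng.random(N) < np.where(z_true, p_x2_z[1], p_x2_z[0])).astype(int)

def bern_logpmf(x, p):
    return x * np.log(p) + (1 - x) * np.log1p(-p)

# Per-datapoint log joint log p(z_n = k, x_n | G), shape (N, 2) over k.
ll_G1 = (np.log(p_z)[None, :]
         + bern_logpmf(x1[:, None], p_x1_z[None, :])
         + bern_logpmf(x2[:, None], p_x2_z[None, :]))
ll_G0 = (np.log(p_z)[None, :]
         + (bern_logpmf(x1, p_x1) + bern_logpmf(x2, p_x2))[:, None])

log_prior_G = np.log(np.array([0.5, 0.5]))   # uniform prior over {G0, G1}
q_G = np.array([0.5, 0.5])                   # variational q(G)

for _ in range(50):
    # q(z_n) proportional to exp( E_{q(G)}[ log p(z_n, x_n | G) ] ):
    # local latent updates average over structures instead of picking one.
    log_qz = q_G[0] * ll_G0 + q_G[1] * ll_G1
    log_qz -= log_qz.max(axis=1, keepdims=True)
    q_z = np.exp(log_qz)
    q_z /= q_z.sum(axis=1, keepdims=True)

    # q(G) proportional to p(G) exp( sum_n E_{q(z_n)}[ log p(z_n, x_n | G) ] ).
    score = log_prior_G + np.array([(q_z * ll_G0).sum(), (q_z * ll_G1).sum()])
    score -= score.max()
    q_G = np.exp(score)
    q_G /= q_G.sum()

print("q(G) over [empty graph, Z->X1 & Z->X2]:", np.round(q_G, 3))
```

The output is a distribution over structures rather than a single graph; on this toy data it concentrates on G1. Structural EM would instead pick the single highest-scoring graph before re-estimating the latent variables, which is exactly the hard commitment that SEP's joint variational treatment avoids.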