Computationally Efficient Methods for Selecting Among Mixtures of Graphical Models

in Bayesian Statistics 6

Published by Oxford University Press | 1999

We describe computationally efficient methods for Bayesian model selection. The methods select among mixtures in which each component is a directed acyclic graphical model (mixtures of DAGs, or MDAGs), and can be applied to data sets in which some of the random variables are not always observed. The model-selection criterion that we consider is the posterior probability of the model (structure) given data. Our model-selection problem is difficult because (1) the number of possible model structures grows super-exponentially with the number of random variables and (2) missing data necessitates the use of computationally slow approximations of model posterior probability. We argue that simple search-and-score algorithms are infeasible for a variety of problems, and introduce a feasible approach in which parameter and structure search is interleaved and expected data is treated as real data. Our approach can be viewed as a combination of the Cheeseman-Stutz asymptotic approximation for model posterior probability and the Expectation-Maximization (EM) algorithm. We evaluate our procedure for selecting among MDAGs on synthetic and real examples.
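To make the combination described above concrete, the Cheeseman-Stutz idea approximates the marginal likelihood of a candidate structure by rescaling a complete-data marginal likelihood. The following is a minimal sketch of the standard form of that approximation, with notation introduced here rather than taken from the paper: $D$ is the observed (incomplete) data, $\hat{\theta}_m$ a parameter estimate for structure $m$ (e.g., from an EM run), and $D_c$ a completed data set whose sufficient statistics equal the expected sufficient statistics computed at $\hat{\theta}_m$:

\[
p(D \mid m) \;\approx\; p(D_c \mid m)\,\frac{p(D \mid \hat{\theta}_m, m)}{p(D_c \mid \hat{\theta}_m, m)}.
\]

In this form, the complete-data term $p(D_c \mid m)$, which is cheap to evaluate in closed form for DAG models, is corrected by the ratio of the observed-data to completed-data likelihoods at $\hat{\theta}_m$, which is the sense in which expected data is treated as real data during structure scoring.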