UAI ’97 Full-Day Course on Uncertain Reasoning



Opening Remarks
Dan Geiger and Prakash P. Shenoy

Part I: Foundations


Fundamental Principles of Representation and Inference
Instructor: Ross Shachter, Stanford University

Graphical Models in the Real World
Instructors: Mark Peot and Michael Shwe, Knowledge Industries


Coffee Break

A Unifying View on Inference
Instructor: Rina Dechter, University of California–Irvine

Part II: Advanced Topics

Advances in Learning Bayesian Networks
Instructor: David Heckerman, Microsoft Research

Approximate Inference via Variational Techniques
Instructor: Michael Jordan, M.I.T.

Coffee Break

Causality: From Metaphysics to Inference and Reasoning I
Instructor: Judea Pearl, UCLA

Causality: From Metaphysics to Inference and Reasoning II
Instructor: Judea Pearl, UCLA


Advances in Learning Bayesian Networks

Instructor: David Heckerman, Microsoft Research

David Heckerman will discuss methods for learning with Bayesian networks: methods for updating the parameters and structure of a Bayesian network given data. He will begin with a review of Bayesian statistics, touching on the concepts of subjective probability, objective probability, random samples, the exponential family, sufficient statistics, and conjugate priors. He will then discuss how methods for model averaging and model selection from Bayesian statistics can be adapted to Bayesian-network learning. Topics will include criteria for model selection, techniques for assigning priors, and search methods. Time permitting, he will discuss methods for handling missing data, including Monte Carlo and Gaussian approximations. At least one real-world application will be presented.
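As a minimal sketch of the conjugate updating the tutorial begins with (the numbers here are illustrative, not taken from the course), the simplest case is a Beta prior on a Bernoulli parameter. The posterior is obtained by adding counts to the prior hyperparameters, and the log marginal likelihood of the data is the kind of quantity used as a model-selection criterion:

```python
from math import lgamma, exp

def beta_posterior(alpha, beta, data):
    """Update a Beta(alpha, beta) prior with binary observations."""
    heads = sum(data)
    tails = len(data) - heads
    return alpha + heads, beta + tails

def log_marginal_likelihood(alpha, beta, data):
    """log P(data) under the Beta-Bernoulli model: the ratio of
    posterior to prior normalizing constants (Beta functions)."""
    a_n, b_n = beta_posterior(alpha, beta, data)
    return (lgamma(alpha + beta) - lgamma(alpha) - lgamma(beta)
            + lgamma(a_n) + lgamma(b_n) - lgamma(a_n + b_n))

data = [1, 1, 0, 1]
print(beta_posterior(1, 1, data))            # uniform prior -> Beta(4, 2)
print(exp(log_marginal_likelihood(1, 1, data)))
```

Scoring each candidate network structure by the marginal likelihood of the data it assigns is one way the criteria mentioned above are built from pieces like this.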

Approximate Inference via Variational Techniques

Instructor: Michael Jordan, M.I.T.

For many graphical models of practical interest, exact inferential calculations are intractable and approximations must be developed. In this tutorial Jordan will describe the principles behind the use of variational methods for approximate inference. These methods provide upper and lower bounds on probabilities in graphical models. They are complementary to the exact techniques in the sense that they tend to be more accurate for dense networks than for sparse networks; moreover, they can readily be combined with exact techniques. Jordan will describe the application of variational ideas in a number of settings, including the QMR database. (This is joint work with Zoubin Ghahramani, Tommi Jaakkola, and Lawrence Saul.)
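A toy sketch of the lower-bound side of these methods (the model and numbers are invented for illustration; the tutorial's techniques also give upper bounds): by Jensen's inequality, any distribution q over the hidden variable yields a lower bound on the log marginal probability of the evidence, with equality when q is the true posterior.

```python
from math import log

# Toy two-state model: unnormalized joint p(x, evidence) over a hidden x.
joint = {0: 0.3, 1: 0.1}
exact = log(sum(joint.values()))  # exact log P(evidence)

def elbo(q):
    """Jensen lower bound: sum_x q(x) log(joint[x]/q(x)) <= exact."""
    return sum(q[x] * log(joint[x] / q[x]) for x in joint if q[x] > 0)

print(elbo({0: 0.5, 1: 0.5}), exact)   # the bound sits below the exact value
print(elbo({0: 0.75, 1: 0.25}))        # true posterior attains the bound
```

Variational inference turns the intractable exact computation into an optimization over q, tightening this bound as far as a tractable family of distributions allows.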

Causality: From Metaphysics to Inference and Reasoning I

Instructor: Judea Pearl, UCLA

The traditional conception of Bayesian networks as carriers of conditional independence information is rapidly giving way to a causal conception, based on mechanisms and interventions. The result is a more natural understanding of what the networks stand for, what judgments are used in constructing these networks, and, most importantly, how actions and plans are to be handled within the framework of standard probability theory. Pearl’s aim in this tutorial is to explain the mathematical foundation and inferential capabilities of causal Bayesian networks, and to advocate their use as the standard tool of analysis. To this end, Pearl will focus on the non-controversial aspects of causation and on the basic tools and skills required for the solution of tangible causal problems. Philosophical speculations will be kept to a minimum. Starting with a functional description of physical mechanisms, he will derive the standard probabilistic properties of Bayesian networks and show, additionally:

  • How the effects of unanticipated actions can be predicted from the network topology
  • How qualitative judgments can be integrated with statistical data (with unobserved variables) to assess the strength of causal influences
  • How actions interact with observations
  • How counterfactual sentences can be interpreted and evaluated
  • What assumptions are needed for inferring causes from data, and what guarantees accompany such inference
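As a hypothetical numeric sketch of the first point above (the network and numbers are invented for illustration), consider a network Z → X → Y with a confounding arc Z → Y. An action do(X=x) deletes the arc entering X, leaving the truncated factorization P(z, y | do(x)) = P(z) P(y | x, z), so the effect of the action can be predicted from the topology alone:

```python
# Conditional probability tables for the network Z -> X -> Y, Z -> Y.
P_z = {0: 0.5, 1: 0.5}                   # P(Z=z)
P_x1_given_z = {0: 0.9, 1: 0.1}          # P(X=1 | z)
P_y1_given_xz = {(0, 0): 0.1, (0, 1): 0.3,
                 (1, 0): 0.6, (1, 1): 0.8}  # P(Y=1 | x, z)

def p_y1_do_x(x):
    """Interventional: arc into X is cut, so sum_z P(z) P(Y=1 | x, z)."""
    return sum(P_z[z] * P_y1_given_xz[(x, z)] for z in P_z)

def p_y1_given_x(x):
    """Observational: ordinary conditioning, sum_z P(z | x) P(Y=1 | x, z)."""
    p_xz = {z: P_z[z] * (P_x1_given_z[z] if x else 1 - P_x1_given_z[z])
            for z in P_z}
    p_x = sum(p_xz.values())
    return sum(p_xz[z] / p_x * P_y1_given_xz[(x, z)] for z in P_z)

print(p_y1_do_x(1), p_y1_given_x(1))  # differ: Z confounds X and Y
```

Here P(Y=1 | do(X=1)) = 0.7 while P(Y=1 | X=1) = 0.62: observing X=1 also shifts belief about the confounder Z, whereas setting X=1 does not.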