How to Get an Exact Sample From a Generic Markov Chain and Sample a Random Spanning Tree From a Directed Graph, Both within the Cover Time
A general problem in computational probability theory is that of generating a random sample from the state space of a Markov chain in accordance with the steady-state probability law of the chain. Another problem is that of generating a random spanning tree of a graph or spanning arborescence of a directed graph in accordance with the uniform distribution, or more generally in accordance with a distribution given by weights on the edges of the graph or digraph. This paper gives algorithms for both of these problems, improving on earlier results and exploiting the duality between the two problems. Each of the new algorithms hinges on the recently introduced technique of coupling from the past or on the linked notions of loop-erased random walk and “cycle-popping”.
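To give a flavor of the coupling-from-the-past idea mentioned above, here is a minimal, illustrative sketch in Python. It is not the paper's algorithm (which runs within the cover time); it assumes the chain is presented as a random update rule `update(state, u)`, and the names `cftp`, `update`, and the reflecting-walk example are hypothetical, chosen purely for illustration.

```python
import random

def cftp(n_states, update, seed=0):
    """Coupling from the past for a finite ergodic Markov chain (sketch).

    `update(state, u)` maps a state and a uniform [0,1) variate to the next
    state; feeding the *same* u to every state at a given time step defines a
    grand coupling.  If the trajectories started from every state at time -T
    all agree at time 0, that common value is an exact draw from the
    stationary distribution (assuming the coupling coalesces almost surely).
    """
    T = 1
    while True:
        # Regenerate the per-step variates from a fixed seed so the randomness
        # used at times -1, -2, ... is REUSED across iterations; this reuse is
        # what makes coupling from the past exact rather than approximate.
        master = random.Random(seed)
        u_at_time = [master.random() for _ in range(T)]  # u_at_time[k] drives time -(k+1)

        states = list(range(n_states))        # one trajectory per state, at time -T
        for t in range(T, 0, -1):             # steps at times -T, ..., -1
            u = u_at_time[t - 1]
            states = [update(s, u) for s in states]

        if len(set(states)) == 1:             # all trajectories coalesced by time 0
            return states[0]
        T *= 2                                # not yet: restart from further in the past


# Toy example: a reflecting walk on {0, 1, 2, 3, 4} that holds at the
# boundaries.  Its transition matrix is doubly stochastic, so the stationary
# distribution is uniform and cftp() returns each state with probability 1/5.
def walk(s, u):
    return min(s + 1, 4) if u < 0.5 else max(s - 1, 0)

sample = cftp(5, walk)
```

The essential point the sketch tries to make explicit is that the variates driving times -1, -2, ... are fixed once and reused whenever the algorithm looks further back into the past; drawing fresh randomness on each doubling would bias the output.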
Copyright © 1996 by the Association for Computing Machinery, Inc. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Publications Dept, ACM Inc., fax +1 (212) 869-0481, or permissions@acm.org. The definitive version of this paper can be found at ACM's Digital Library: http://www.acm.org/dl/.