Learning structure of power-law Markov networks

  • Abhik Das,
  • Praneeth Netrapalli,
  • Sujay Sanghavi,
  • Sriram Vishwanath

IEEE International Symposium on Information Theory (ISIT) 2014

Published by IEEE

This paper considers the problem of learning the underlying graph structure of discrete Markov networks defined on power-law graphs generated by the configuration model. The learning problem is translated into an equivalent channel coding problem, and necessary conditions for recovery are derived in terms of the problem parameters. In particular, the exponent of the power-law degree distribution is related to the hardness of the learning problem: a greater number of samples is required for exact recovery of discrete power-law Markov graphs with small exponent values. An efficient algorithm for accurately reconstructing the structure of Ising models on power-law graphs is then developed. Finally, it is shown that an order-wise optimal number of samples suffices for exact graph recovery under certain constraints on the Ising model parameters and the scaling of node degrees.
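The configuration model referenced in the abstract constructs a random graph from a prescribed degree sequence by pairing up degree "stubs" uniformly at random. As a minimal illustrative sketch (the function names, the exponent value, and the degree bounds below are assumptions for illustration, not taken from the paper), one can sample a power-law degree sequence and wire it into a configuration-model multigraph as follows:

```python
import random

def powerlaw_degrees(n, gamma, d_min=1, d_max=None, seed=0):
    """Sample n node degrees from P(d) proportional to d^(-gamma)
    on {d_min, ..., d_max}. Smaller gamma yields heavier tails."""
    rng = random.Random(seed)
    if d_max is None:
        d_max = n - 1                     # simple graph-motivated cap
    support = list(range(d_min, d_max + 1))
    weights = [d ** (-gamma) for d in support]
    degrees = rng.choices(support, weights=weights, k=n)
    if sum(degrees) % 2 == 1:             # total stub count must be even
        degrees[0] += 1
    return degrees

def configuration_model(degrees, seed=0):
    """Pair stubs uniformly at random; the result is a multigraph
    and may contain self-loops and repeated edges, as in the
    standard configuration model."""
    rng = random.Random(seed)
    stubs = [v for v, d in enumerate(degrees) for _ in range(d)]
    rng.shuffle(stubs)
    return [(stubs[i], stubs[i + 1]) for i in range(0, len(stubs), 2)]

# Example: a 100-node power-law graph with an illustrative exponent.
degrees = powerlaw_degrees(100, gamma=2.5, seed=1)
edges = configuration_model(degrees, seed=1)
```

Graphs built this way would serve as the underlying structure of the Markov network; the paper's learning question is then how many i.i.d. samples from the model are needed to recover `edges` exactly.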