Learning high-dimensional graphical models with regularized quadratic scoring

09/15/2018
by Eric Janofsky et al.

Pairwise Markov random fields (MRFs), or undirected graphical models, are parsimonious representations of joint probability distributions. Variables correspond to the nodes of a graph, with edges between nodes encoding conditional dependencies. Unfortunately, likelihood-based learning and inference are hampered by the intractability of computing the normalizing constant. This paper considers an alternative scoring rule to the log-likelihood which obviates the need to compute the normalizing constant of the distribution. We show that this rule is a quadratic function of the natural parameters, and we optimize the sum of the scoring rule and a sparsity-inducing regularizer. For general continuous-valued exponential families, we provide theoretical results on parameter and edge consistency. As a special case, we detail a new approach to sparse precision matrix estimation whose theoretical guarantees match those of the graphical lasso of Yuan and Lin (2007), with faster computational performance than the glasso algorithm of Yuan (2010). We then describe results for model selection in the nonparametric pairwise graphical model using exponential series. The regularized score matching problem is shown to be a convex program; we provide scalable algorithms based on the consensus alternating direction method of multipliers (Boyd et al., 2011) and coordinate-wise descent.
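For concreteness, in the Gaussian special case the regularized quadratic scoring estimator takes a simple closed form. The display below is a plausible specialization consistent with the abstract (it is the standard Hyvärinen score matching loss for a Gaussian with precision matrix $K$ and sample covariance $\hat{\Sigma}$), not a formula quoted from the paper:

$$
\hat{K} \in \arg\min_{K = K^\top}\; \tfrac{1}{2}\,\operatorname{tr}\!\big(K \hat{\Sigma} K\big) - \operatorname{tr}(K) + \lambda \sum_{j \neq k} |K_{jk}|,
$$

which is quadratic in $K$, so the penalized problem is convex and requires no normalizing constant.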

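As a minimal illustration of how such a quadratic-plus-$\ell_1$ program might be solved, the sketch below applies plain proximal gradient descent (ISTA) to the Gaussian objective above. It is not the paper's consensus-ADMM or coordinate-descent algorithm; the function names (`gaussian_score_matching`, `soft_threshold`), the unpenalized diagonal, and the choice to drop the positive-definiteness constraint are assumptions made for brevity.

```python
import numpy as np

def soft_threshold(A, t):
    """Entrywise soft-thresholding: the prox operator of t*||.||_1."""
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def gaussian_score_matching(S, lam, n_iter=1000, tol=1e-8):
    """Proximal-gradient (ISTA) sketch for
        min_K 0.5*tr(K S K) - tr(K) + lam * sum_{j != k} |K_jk|
    over symmetric K, where S is the p x p sample covariance.
    NOTE: illustrative only; ignores the K > 0 constraint and is not
    the consensus-ADMM / coordinate-descent method of the paper.
    """
    p = S.shape[0]
    K = np.eye(p)
    # Step size: the smooth part has gradient 0.5*(S K + K S) - I,
    # which is Lipschitz with constant lambda_max(S).
    step = 1.0 / np.linalg.eigvalsh(S)[-1]
    for _ in range(n_iter):
        grad = 0.5 * (S @ K + K @ S) - np.eye(p)
        G = K - step * grad
        K_new = soft_threshold(G, step * lam)  # shrink all entries...
        np.fill_diagonal(K_new, np.diag(G))    # ...but leave the diagonal unpenalized
        K_new = 0.5 * (K_new + K_new.T)        # re-symmetrize against round-off
        if np.max(np.abs(K_new - K)) < tol:
            return K_new
        K = K_new
    return K

# Usage: recover a sparse tridiagonal precision matrix from simulated data.
rng = np.random.default_rng(0)
K_true = np.eye(5) + 0.4 * np.diag(np.ones(4), 1) + 0.4 * np.diag(np.ones(4), -1)
X = rng.multivariate_normal(np.zeros(5), np.linalg.inv(K_true), size=2000)
S = np.cov(X, rowvar=False)
print(np.round(gaussian_score_matching(S, lam=0.1), 2))
```

Because the smooth part of the objective is quadratic with a known Lipschitz constant, a fixed step size suffices; the paper's ADMM and coordinate-wise schemes target the same convex program at larger scale.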