Conditional Matrix Flows for Gaussian Graphical Models

06/12/2023
by Marcello Massimo Negri, et al.

Studying conditional independence structure among many variables with few observations is a challenging task. Gaussian Graphical Models (GGMs) tackle this problem by encouraging sparsity in the precision matrix through an l_p regularization with p≤1. However, since the objective is highly non-convex for sub-l_1 pseudo-norms, most approaches rely on the l_1 norm. In that case, frequentist approaches allow the solution path to be computed elegantly as a function of the shrinkage parameter λ. Instead of optimizing the penalized likelihood, the Bayesian formulation introduces a Laplace prior on the precision matrix; posterior inference for different λ values, however, requires repeated runs of expensive Gibbs samplers. We propose a general framework for variational inference in GGMs that unifies the benefits of the frequentist and Bayesian frameworks. Specifically, we approximate the posterior with a matrix-variate normalizing flow defined on the space of symmetric positive definite matrices. As a key improvement over previous work, we train a continuum of sparse regression models jointly for all regularization parameters λ and all l_p norms, including the non-convex sub-l_1 pseudo-norms. This is achieved by conditioning the flow on p>0 and on the shrinkage parameter λ. A single model then gives access to (i) the evolution of the posterior for any λ and any l_p (pseudo-)norm, (ii) the marginal log-likelihood for model selection, and (iii) the frequentist solution paths, which we recover as the MAP via simulated annealing.
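To make the setup concrete, below is a minimal sketch (in PyTorch, not the authors' code) of two objects the abstract refers to: the l_p-penalized Gaussian log-likelihood over the precision matrix, and a bijection from unconstrained parameters to symmetric positive definite (SPD) matrices of the kind a matrix-variate flow on the SPD cone can build on. The names `to_spd` and `penalized_loglik`, the Cholesky parameterization, and the choice to penalize only off-diagonal entries are illustrative assumptions.

```python
# Minimal sketch (assumed implementation, not from the paper): the
# l_p-penalized Gaussian log-likelihood over the precision matrix Theta, and
# an unconstrained-to-SPD bijection a matrix-variate flow could build on.
import torch

def to_spd(x: torch.Tensor, d: int) -> torch.Tensor:
    # Fill a lower-triangular Cholesky factor: strict lower triangle from the
    # first d*(d-1)//2 entries of x, softplus-positive diagonal from the rest.
    off = d * (d - 1) // 2
    L = x.new_zeros(d, d)
    tril = torch.tril_indices(d, d, offset=-1)
    L[tril[0], tril[1]] = x[:off]
    diag = torch.arange(d)
    L[diag, diag] = torch.nn.functional.softplus(x[off:]) + 1e-6
    return L @ L.T  # symmetric positive definite by construction

def penalized_loglik(theta, S, n, lam, p):
    # (n/2) * (log det Theta - tr(S Theta)) - lam * sum_{i != j} |theta_ij|^p
    # (penalizing only off-diagonal entries is a common modeling choice).
    fit = 0.5 * n * (torch.logdet(theta) - torch.trace(S @ theta))
    off_diag = theta - torch.diag_embed(torch.diagonal(theta))
    return fit - lam * off_diag.abs().pow(p).sum()

# Toy usage: evaluate one precision matrix under several (lam, p) settings;
# the paper's conditional flow amortizes over (lam, p) instead of refitting.
torch.manual_seed(0)
d, n = 5, 20
X = torch.randn(n, d)
S = X.T @ X / n                          # empirical covariance
x = torch.randn(d * (d + 1) // 2)        # unconstrained parameters
theta = to_spd(x, d)
for lam, p in [(0.1, 1.0), (1.0, 1.0), (1.0, 0.5)]:
    print(f"lam={lam}, p={p}: {penalized_loglik(theta, S, n, lam, p).item():.3f}")
```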
