Variational Inference via Transformations on Distributions

07/09/2017
by Siddhartha Saxena, et al.

Variational inference methods often focus on efficient model optimization, with little emphasis on the choice of the approximating posterior. In this paper, we review and implement several methods for building a rich family of approximating posteriors. We show that one particular method, which applies transformations to distributions, yields especially rich and flexible posterior approximations. We analyze its performance on the MNIST dataset by implementing it within a variational autoencoder and demonstrate its effectiveness in learning better posterior distributions.
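The abstract does not spell out the transformation; a common choice in this line of work is the planar normalizing flow of Rezende and Mohamed (2015), which warps samples from a simple base posterior while tracking the change in density. Below is a minimal NumPy sketch of one planar flow step, assuming a Gaussian base distribution; the parameter names and shapes are illustrative, not taken from the paper.

```python
# Minimal sketch of a planar normalizing flow step (Rezende & Mohamed, 2015):
# f(z) = z + u * tanh(w^T z + b), with its log|det Jacobian| per sample.
# All names and shapes here are illustrative assumptions.
import numpy as np

def planar_flow(z, u, w, b):
    """Transform a batch of base samples z (shape (n, d)) with one planar flow.

    u, w : (d,) flow parameters, b : scalar bias.
    Returns the transformed samples and log|det df/dz| per sample, which is
    needed to evaluate the density of the transformed distribution.
    """
    lin = z @ w + b                               # (n,) pre-activation w^T z + b
    f_z = z + np.outer(np.tanh(lin), u)           # (n, d) transformed samples
    psi = np.outer(1.0 - np.tanh(lin) ** 2, w)    # (n, d) h'(w^T z + b) * w
    log_det = np.log(np.abs(1.0 + psi @ u))       # (n,) log|1 + u^T psi(z)|
    return f_z, log_det

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 2
    z0 = rng.standard_normal((1000, d))           # samples from q0 = N(0, I)
    u, w, b = rng.standard_normal(d), rng.standard_normal(d), 0.5
    zk, log_det = planar_flow(z0, u, w, b)
    # log q_k(z_k) = log q_0(z_0) - log|det df/dz|; richer posteriors come
    # from stacking several such steps inside the VAE's inference network.
    print(zk.shape, float(log_det.mean()))
```

Stacking several such steps, each with its own parameters, is what turns a simple Gaussian posterior into the richer, more complex approximations the abstract refers to.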
