Differentiable Antithetic Sampling for Variance Reduction in Stochastic Variational Inference

10/05/2018
by Mike Wu, et al.

Stochastic optimization techniques are standard in variational inference algorithms. These methods estimate gradients by approximating expectations with independent Monte Carlo samples. In this paper, we explore a technique that uses correlated, but more representative, samples to reduce estimator variance. Specifically, we show how to generate antithetic samples whose sample moments match the true moments of an underlying importance distribution. Combining a differentiable antithetic sampler with modern stochastic variational inference, we showcase the effectiveness of this approach for learning a deep generative model.
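To illustrate the basic idea behind antithetic sampling, the sketch below pairs each standard-normal draw with its negation, so the sample mean matches the true mean exactly. This is only a minimal, generic illustration of variance reduction via antithetic pairs; the paper's sampler is differentiable and matches higher-order moments of an importance distribution, which this sketch does not implement. The function names `antithetic_normal_samples` and `f` are hypothetical.

```python
import numpy as np

def antithetic_normal_samples(n, dim, rng=None):
    """Draw n standard-normal samples as antithetic pairs (z, -z).

    Pairing each draw with its negation forces the sample mean to be
    exactly zero, matching the first moment of the distribution and
    typically reducing Monte Carlo estimator variance.
    """
    rng = np.random.default_rng() if rng is None else rng
    half = (n + 1) // 2
    z = rng.standard_normal((half, dim))
    return np.concatenate([z, -z], axis=0)[:n]

# Toy comparison: estimate E[f(z)] with i.i.d. vs. antithetic samples.
def f(z):
    return np.exp(0.5 * z).sum(axis=1)  # arbitrary smooth test function

rng = np.random.default_rng(0)
n, dim, trials = 64, 2, 2000
iid_est = [f(rng.standard_normal((n, dim))).mean() for _ in range(trials)]
anti_est = [f(antithetic_normal_samples(n, dim, rng)).mean() for _ in range(trials)]

print("i.i.d. estimator variance     :", np.var(iid_est))
print("antithetic estimator variance :", np.var(anti_est))
```

For monotone test functions like the one above, the negated pairs are negatively correlated, so the antithetic estimator's variance is noticeably lower than the i.i.d. baseline at the same sample budget.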
