Regularised Zero-Variance Control Variates for High-Dimensional Variance Reduction

11/13/2018
by Leah F. South, et al.

Zero-variance control variates (ZV-CV) are a post-processing method to reduce the variance of Monte Carlo estimators of expectations using the derivatives of the log target. Once the derivatives are available, the only additional computational effort is solving a linear regression problem. Significant variance reductions have been achieved with this method in low-dimensional examples, but the number of covariates in the regression grows rapidly with the dimension of the target. We propose to exploit penalised regression to make the method more flexible and feasible, particularly in higher dimensions. Connections between this penalised ZV-CV approach and control functionals are made, providing additional motivation for our approach. Another type of regularisation based on using subsets of derivatives, or a priori regularisation as we refer to it in this paper, is also proposed to reduce computational and storage requirements. Methods for applying ZV-CV and regularised ZV-CV to sequential Monte Carlo (SMC) are described, and a new estimator for the normalising constant of the posterior is developed to aid Bayesian model choice. Several examples showing the utility and limitations of regularised ZV-CV for Bayesian inference are given. The methods proposed in this paper are accessible through the R package ZVCV, available at https://github.com/LeahPrice/ZVCV.
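To make the post-processing step concrete, the following is a minimal Python sketch of the idea on a toy Gaussian target, where the score function is available in closed form. The first-order control variate is fitted by ordinary least squares (standard ZV-CV) and, for illustration, by lasso as one possible form of penalised regression. The toy target, the integrand, the sample size, and the use of scikit-learn are assumptions made for this sketch; it is not the interface of the authors' R package ZVCV.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LassoCV

rng = np.random.default_rng(0)

# Toy target: d-dimensional standard normal, so grad log pi(theta) = -theta.
d, n = 20, 500
theta = rng.standard_normal((n, d))   # samples from the target
grad_log_pi = -theta                  # derivatives of the log target
f = theta[:, 0] ** 2                  # integrand; true expectation is 1

# First-order ZV-CV: the control variate c^T grad_log_pi has zero expectation
# under the target, so regress f on the score and subtract the fitted part.
ols = LinearRegression().fit(grad_log_pi, f)
zvcv_est = np.mean(f - grad_log_pi @ ols.coef_)

# Penalised ZV-CV, illustrated here with cross-validated lasso. This becomes
# relevant when the number of covariates grows with dimension, e.g. for a
# second-order polynomial there are d + d(d+1)/2 regression terms.
lasso = LassoCV(cv=5).fit(grad_log_pi, f)
pen_est = np.mean(f - grad_log_pi @ lasso.coef_)

print("vanilla Monte Carlo :", f.mean())
print("ZV-CV (OLS)         :", zvcv_est)
print("penalised ZV-CV     :", pen_est)
```

In this first-order case the control variate comes from a linear polynomial, for which the Laplacian term vanishes and only the score enters the regression; higher-order polynomials add many more covariates, which is where the penalised and a priori regularisation discussed in the paper become important.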
