Adaptive MCMC for synthetic likelihoods and correlated synthetic likelihoods
Approximate Bayesian computation (ABC) and synthetic likelihood (SL) are strategies for parameter inference when the likelihood function is analytically or computationally intractable. In SL, the likelihood function of the data is replaced by a multivariate Gaussian density for summary statistics that compress the observed data. While SL is conceptually simpler to implement than ABC, it requires the simulation of many replicate datasets at every parameter value considered by a sampling algorithm, such as MCMC, making the method very computationally intensive. We propose two strategies to alleviate the computational burden imposed by SL algorithms. First, we introduce a novel adaptive MCMC algorithm for SL in which the proposal distribution is sequentially tuned. Second, we exploit existing strategies from the correlated particle filter literature to improve MCMC mixing in an SL framework. Additionally, we show how Bayesian optimization can be used to rapidly generate promising starting values for SL inference. Our combined goal is to make the best of each expensive MCMC iteration when using synthetic likelihood algorithms, which will broaden the scope of these methods to complex modeling problems with costly simulators. To illustrate the advantages stemming from our framework, we consider three benchmark examples, including the estimation of parameters for a cosmological model.
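As a rough illustration of the Gaussian synthetic-likelihood evaluation described above, the sketch below simulates replicate datasets at a given parameter value, compresses each to summary statistics, and evaluates a Gaussian log-density at the observed summaries. The functions `simulate` and `summarize`, the argument names, and the number of replicates are illustrative placeholders, not the paper's actual code or recommended settings.

```python
import numpy as np
from scipy.stats import multivariate_normal


def synthetic_loglik(theta, observed_summary, simulate, summarize,
                     n_reps=200, rng=None):
    """Gaussian synthetic log-likelihood at parameter value `theta`.

    `simulate(theta, rng)` and `summarize(data)` are user-supplied stand-ins
    for the model simulator and the summary-statistic function (assumed here
    for illustration only).
    """
    rng = np.random.default_rng(rng)
    # Simulate n_reps replicate datasets and compress each to summary statistics.
    summaries = np.array([summarize(simulate(theta, rng)) for _ in range(n_reps)])
    # Fit a multivariate Gaussian to the simulated summaries.
    mu = summaries.mean(axis=0)
    sigma = np.cov(summaries, rowvar=False)
    # Evaluate the Gaussian approximation at the observed summaries.
    return multivariate_normal.logpdf(observed_summary, mean=mu, cov=sigma)
```

In the full methodology, a quantity of this kind would be evaluated inside an (adaptive) Metropolis-Hastings loop, with a fresh set of simulations at each proposed parameter; the correlated-SL variant mentioned in the abstract would additionally correlate the random numbers passed to the simulator across successive iterations to reduce the variance of acceptance-ratio estimates.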