Sequential pCN-MCMC, an efficient MCMC method for Bayesian inversion of high-dimensional multi-Gaussian priors

03/24/2021
by Sebastian Reuschen, et al.

In geostatistics, Gaussian random fields are often used to model heterogeneities of soil or subsurface parameters. To obtain spatial approximations of these random fields, they are discretized. Then, different techniques of geostatistical inversion are used to condition them on measurement data. Among these techniques, Markov chain Monte Carlo (MCMC) methods stand out because they yield asymptotically unbiased conditional realizations. However, standard MCMC methods suffer from the curse of dimensionality when the discretization is refined. This means that their efficiency decreases rapidly with an increasing number of discretization cells. Several MCMC approaches have been developed whose efficiency does not depend on the discretization of the random field. The preconditioned Crank-Nicolson MCMC (pCN-MCMC) and sequential Gibbs (or block-Gibbs) sampling are two examples. In this paper, we present a combination of pCN-MCMC and sequential Gibbs sampling. Our algorithm, the sequential pCN-MCMC, depends on two tuning parameters: the correlation parameter β of the pCN approach and the block size κ of the sequential Gibbs approach. The original pCN-MCMC and the Gibbs sampling algorithm are special cases of our method. We present an algorithm that automatically finds the best tuning-parameter combination (κ and β) during the burn-in phase, thus choosing the best possible hybrid of the two methods. In our test cases, we achieve speedup factors of 1-5.5 over pCN and of 1-6.5 over Gibbs. Furthermore, we provide the MATLAB implementation of our method as open-source code.
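To make the hybrid idea concrete, the following is a minimal Python sketch (not the authors' open-source MATLAB implementation) of a single sequential pCN-MCMC update for a zero-mean multi-Gaussian prior with covariance matrix C. The names `log_likelihood`, `beta`, `block`, and `rng` are illustrative assumptions: `beta` is the pCN correlation parameter, and the size of `block` plays the role of the block-size parameter κ.

```python
import numpy as np

def sequential_pcn_step(x, C, log_likelihood, beta, block, rng):
    """One hybrid update: a pCN move restricted to one block of cells,
    proposed from the prior conditioned on the remaining (fixed) cells."""
    rest = np.setdiff1d(np.arange(x.size), block)

    # Conditional prior of the block given the fixed rest:
    #   mean  m = C_br C_rr^{-1} x_r,  covariance  S = C_bb - C_br C_rr^{-1} C_rb
    C_bb = C[np.ix_(block, block)]
    C_br = C[np.ix_(block, rest)]
    if rest.size:
        C_rr = C[np.ix_(rest, rest)]
        solve = np.linalg.solve(C_rr, np.column_stack((x[rest], C_br.T)))
        m = C_br @ solve[:, 0]
        S = C_bb - C_br @ solve[:, 1:]
    else:
        # Whole field in one block: conditional prior equals the full prior.
        m, S = np.zeros(block.size), C_bb

    # pCN proposal within the block, centered on the conditional mean:
    #   x_b' = m + sqrt(1 - beta^2) (x_b - m) + beta * xi,  xi ~ N(0, S)
    xi = rng.multivariate_normal(np.zeros(block.size), S)
    x_new = x.copy()
    x_new[block] = m + np.sqrt(1.0 - beta**2) * (x[block] - m) + beta * xi

    # Prior terms cancel in this proposal, so acceptance depends only on
    # the likelihood ratio (Metropolis-Hastings in log space).
    if np.log(rng.uniform()) < log_likelihood(x_new) - log_likelihood(x):
        return x_new, True
    return x, False
```

In this sketch the two special cases mentioned in the abstract appear directly: setting `beta = 1` resamples the block from the conditional prior (sequential Gibbs), while choosing `block` to cover the whole field reduces the update to standard pCN-MCMC.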
