Global convergence of optimized adaptive importance samplers

01/02/2022
by   Ömer Deniz Akyıldız, et al.

We analyze the optimized adaptive importance sampler (OAIS) for performing Monte Carlo integration with general proposals. We leverage a classical result showing that the bias and the mean-squared error (MSE) of importance sampling scale with the χ^2-divergence between the target and the proposal, and develop a scheme that performs global optimization of the χ^2-divergence. While this quantity is known to be convex for exponential-family proposals, the case of general proposals has remained an open problem. We close this gap by utilizing stochastic gradient Langevin dynamics (SGLD) and its underdamped counterpart for the global optimization of the χ^2-divergence, and derive nonasymptotic bounds for the MSE by leveraging recent results from the non-convex optimization literature. The resulting AIS schemes have explicit theoretical guarantees that are uniform in the number of iterations.
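To illustrate the idea, the following is a minimal sketch, not the paper's exact algorithm or notation: the mean `theta` of a Gaussian proposal is adapted with SGLD steps on a Monte Carlo estimate of the gradient of ρ(θ) = E_{q_θ}[w(x)^2] with w = π/q_θ, the quantity (up to constants) that controls the χ^2-divergence and hence the MSE of the importance sampler. The target `mu_target`, the step size, the inverse temperature `beta`, and the gradient clipping are all illustrative assumptions.

```python
import numpy as np

# Sketch: adaptive importance sampling where the proposal mean theta is
# adapted by SGLD on an estimate of grad_theta E_{q_theta}[w^2].
# Toy setup (assumed for illustration): target N(mu_target, 1), proposal N(theta, 1).

rng = np.random.default_rng(0)
mu_target = 1.0  # hypothetical toy target; the method applies to general targets

def log_target(x):
    return -0.5 * (x - mu_target) ** 2        # unnormalized log pi(x)

def log_proposal(x, theta):
    return -0.5 * (x - theta) ** 2            # unnormalized log q_theta(x)

def grad_log_proposal(x, theta):
    return x - theta                          # d/dtheta log q_theta(x)

def sgld_step(theta, step, beta, n=256):
    """One SGLD update on an estimate of grad_theta E_{q_theta}[w^2].

    Uses the identity grad_theta E_q[w^2] = -E_q[w^2 * grad_theta log q_theta],
    estimated with n samples drawn from the current proposal.
    """
    x = rng.normal(theta, 1.0, size=n)
    w = np.exp(log_target(x) - log_proposal(x, theta))   # importance weights
    grad = -np.mean(w ** 2 * grad_log_proposal(x, theta))
    grad = np.clip(grad, -50.0, 50.0)         # weights can be heavy-tailed; clip for stability
    noise = rng.normal(0.0, np.sqrt(2.0 * step / beta))  # Langevin exploration noise
    return theta - step * grad + noise

theta = -1.0                                  # deliberately poor initial proposal
for _ in range(2000):
    theta = sgld_step(theta, step=1e-2, beta=50.0)

# Importance sampling estimate of E_pi[X] with the adapted proposal.
x = rng.normal(theta, 1.0, size=10_000)
w = np.exp(log_target(x) - log_proposal(x, theta))
print(f"adapted theta ~ {theta:.2f}, self-normalized IS mean ~ {np.sum(w * x) / np.sum(w):.2f}")
```

In this toy Gaussian case the χ^2-divergence is minimized when the proposal mean matches the target mean, so `theta` should drift toward `mu_target`; the Langevin noise is what provides the global (rather than merely local) optimization guarantees discussed in the abstract.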
