On stochastic gradient Langevin dynamics with dependent data streams in the logconcave case

12/06/2018
by M. Barkhagen, et al.

Stochastic Gradient Langevin Dynamics (SGLD) combines a Robbins-Monro type algorithm with Langevin dynamics to perform data-driven stochastic optimization. In this paper, the SGLD method with fixed step size λ is considered in order to sample from a logconcave target distribution π, known up to a normalisation factor. We assume that unbiased estimates of the gradient from possibly dependent observations are available. It is shown that, for all ε>0, the Wasserstein-2 distance of the nth iterate of the SGLD algorithm from π is dominated by c_1(ε)[λ^{1/2-ε} + e^{-aλn}] with appropriate constants c_1(ε), a>0.
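To make the fixed-step-size scheme concrete, here is a minimal sketch of the SGLD iteration x_{n+1} = x_n - λ ∇U(x_n) + sqrt(2λ) ξ_n, where π ∝ exp(-U) and ξ_n is standard Gaussian noise. Everything below is illustrative rather than taken from the paper: the quadratic target U(x) = ‖x‖²/2 (a standard Gaussian, hence logconcave), the noisy-but-unbiased gradient standing in for gradients computed from streaming data, and all function names are assumptions for the sketch.

```python
import numpy as np

def sgld(grad_estimate, x0, step_size, n_iters, rng):
    """Fixed-step-size SGLD: x_{n+1} = x_n - λ ĝ(x_n) + sqrt(2λ) ξ_n,
    where ĝ is an unbiased estimate of ∇U and ξ_n ~ N(0, I)."""
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_iters, x.size))
    for n in range(n_iters):
        noise = rng.standard_normal(x.size)
        x = x - step_size * grad_estimate(x) + np.sqrt(2.0 * step_size) * noise
        samples[n] = x
    return samples

rng = np.random.default_rng(0)

def noisy_grad(x):
    # ∇U(x) = x for U(x) = ‖x‖²/2; the additive zero-mean noise is a
    # stand-in for the gradient-estimation error from observed data.
    return x + 0.1 * rng.standard_normal(x.size)

samples = sgld(noisy_grad, x0=np.zeros(2), step_size=1e-2, n_iters=50_000, rng=rng)
# After burn-in the empirical moments should approach those of π = N(0, I).
print(samples[10_000:].mean(axis=0), samples[10_000:].var(axis=0))
```

With a fixed step size λ the chain does not converge to π exactly; consistent with the bound above, the Wasserstein-2 error decays like e^{-aλn} down to a bias floor of order λ^{1/2-ε}, so smaller λ means lower bias but slower mixing.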
