Estimating 2-Sinkhorn Divergence between Gaussian Processes from Finite-Dimensional Marginals

02/05/2021
by Anton Mallasto, et al.

Optimal Transport (OT) has emerged as an important computational tool in machine learning and computer vision, providing a geometrical framework for studying probability measures. Unfortunately, OT suffers from the curse of dimensionality and requires regularization for practical computation; entropic regularization is a popular choice, and it can be debiased, resulting in the Sinkhorn divergence. In this work, we study the convergence of estimates of the 2-Sinkhorn divergence between Gaussian processes (GPs) that are computed from their finite-dimensional marginal distributions. We show almost sure convergence of the divergence estimate when the marginals are sampled according to some base measure. Furthermore, we show that with n marginals the estimation error of the divergence scales in a dimension-free way as 𝒪(ϵ^{-1}n^{-1/2}), where ϵ is the magnitude of the entropic regularization.
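To make the objects in the abstract concrete, here is a minimal sketch (not the paper's estimator) of the debiased Sinkhorn divergence S_ε(μ, ν) = OT_ε(μ, ν) − ½OT_ε(μ, μ) − ½OT_ε(ν, ν) between two empirical measures, using standard log-domain Sinkhorn iterations with a squared-Euclidean ground cost. The function names, iteration count, and sample-based setup are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.special import logsumexp

def sinkhorn_cost(x, y, eps=0.1, n_iter=200):
    """Entropic OT cost OT_eps between uniform empirical measures on the
    point clouds x (n, d) and y (m, d), squared-Euclidean ground cost.
    Log-domain Sinkhorn iterations for numerical stability."""
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)  # (n, m) cost matrix
    n, m = C.shape
    log_a = -np.log(n) * np.ones(n)  # uniform weights, log scale
    log_b = -np.log(m) * np.ones(m)
    f, g = np.zeros(n), np.zeros(m)  # dual potentials
    for _ in range(n_iter):
        f = -eps * logsumexp((g[None, :] - C) / eps + log_b[None, :], axis=1)
        g = -eps * logsumexp((f[:, None] - C) / eps + log_a[:, None], axis=0)
    # Transport plan P_ij = a_i b_j exp((f_i + g_j - C_ij) / eps)
    log_P = (f[:, None] + g[None, :] - C) / eps + log_a[:, None] + log_b[None, :]
    return np.sum(np.exp(log_P) * C)

def sinkhorn_divergence(x, y, eps=0.1):
    """Debiased 2-Sinkhorn divergence between empirical measures on x and y."""
    return (sinkhorn_cost(x, y, eps)
            - 0.5 * sinkhorn_cost(x, x, eps)
            - 0.5 * sinkhorn_cost(y, y, eps))
```

The debiasing terms remove the entropic bias, so the divergence vanishes when both inputs coincide and is positive for distinct measures, e.g. for samples drawn from two GP marginals with different means.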
