Doubly Regularized Entropic Wasserstein Barycenters

03/21/2023
by Lenaïc Chizat, et al.

We study a general formulation of regularized Wasserstein barycenters that enjoys favorable regularity, approximation, stability and (grid-free) optimization properties. This barycenter is defined as the unique probability measure that minimizes the sum of entropic optimal transport (EOT) costs with respect to a family of given probability measures, plus an entropy term. We denote it the (λ,τ)-barycenter, where λ is the inner regularization strength and τ the outer one. This formulation recovers several previously proposed EOT barycenters for various choices of λ,τ ≥ 0 and generalizes them. First, despite (and in fact owing to) being doubly regularized, we show that our formulation is debiased for τ = λ/2: for smooth densities, the suboptimality in the (unregularized) Wasserstein barycenter objective is of order λ^2, the square of the entropic regularization strength, instead of order max{λ,τ} in general. We illustrate this phenomenon for isotropic Gaussians, for which all (λ,τ)-barycenters have a closed form. Second, we show that for λ,τ > 0 this barycenter has a smooth density and is strongly stable under perturbation of the marginals. In particular, it can be estimated efficiently: given n samples from each of the probability measures, the barycenter built from the empirical measures converges in relative entropy to the population barycenter at the rate n^{-1/2}. Finally, this formulation lends itself naturally to a grid-free optimization algorithm: we propose a simple noisy particle gradient descent which, in the mean-field limit, converges globally to the barycenter at an exponential rate.
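The abstract does not display the objective itself; the following is a sketch consistent with its description, assuming marginals μ_1, …, μ_k with weights w_i ≥ 0 summing to one (the weights are an illustrative assumption), an entropic OT cost T_λ, and an entropy penalty H:

$$
\nu^{(\lambda,\tau)} \;=\; \operatorname*{arg\,min}_{\nu \in \mathcal{P}(\mathbb{R}^d)} \; \sum_{i=1}^{k} w_i \, T_\lambda(\mu_i, \nu) \;+\; \tau \, H(\nu).
$$

With τ = λ/2 this is the debiased case described above; other choices of λ, τ ≥ 0 recover previously proposed EOT barycenters.

As a concrete illustration of the grid-free optimization mentioned in the last sentence, here is a minimal toy sketch (NumPy) of a noisy particle scheme for empirical marginals. The function names (sinkhorn_plan, noisy_particle_descent), step sizes, iteration counts and the plain Sinkhorn routine are illustrative assumptions, not the paper's exact algorithm:

```python
# Hypothetical toy sketch: noisy particle gradient descent toward a
# doubly regularized entropic barycenter of empirical measures.
import numpy as np

def sinkhorn_plan(a, b, C, lam, n_iter=200):
    """Entropic OT plan between histograms a, b for cost matrix C, regularization lam."""
    K = np.exp(-C / lam)                      # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):                   # standard Sinkhorn fixed-point iterations
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]        # plan with marginals approximately (a, b)

def noisy_particle_descent(samples, weights, lam, tau, m=200, d=2,
                           step=0.05, n_steps=500, rng=np.random.default_rng(0)):
    """Evolve m particles representing the barycenter candidate."""
    x = rng.standard_normal((m, d))           # particle positions
    b = np.full(m, 1.0 / m)                   # uniform particle weights
    for _ in range(n_steps):
        drift = np.zeros_like(x)
        for w_i, y in zip(weights, samples):  # y: (n_i, d) samples of the i-th marginal
            a = np.full(len(y), 1.0 / len(y))
            C = ((y[:, None, :] - x[None, :, :]) ** 2).sum(-1)   # squared distances
            pi = sinkhorn_plan(a, b, C, lam)
            # barycentric projection of each particle under the entropic plan
            proj = (pi.T @ y) / pi.sum(axis=0)[:, None]
            # Wasserstein gradient of the EOT term at the particle:
            # 2 * (x - barycentric projection), weighted by w_i
            drift += w_i * 2.0 * (x - proj)
        noise = np.sqrt(2.0 * tau * step) * rng.standard_normal(x.shape)
        x = x - step * drift + noise          # Langevin-type step: noise realizes the outer entropy
    return x

# Usage: barycenter of two Gaussian point clouds with tau = lam / 2 (the debiased choice).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    mu1 = rng.standard_normal((300, 2)) - np.array([2.0, 0.0])
    mu2 = rng.standard_normal((300, 2)) + np.array([2.0, 0.0])
    lam = 0.5
    particles = noisy_particle_descent([mu1, mu2], [0.5, 0.5], lam=lam, tau=lam / 2)
    print(particles.mean(axis=0))             # should be near the midpoint (0, 0)
```

The Gaussian noise of variance 2τ·step is the Langevin-type term that accounts for the outer entropy penalty τH(ν); setting τ = λ/2 corresponds to the debiased choice highlighted in the abstract.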
