Ratio convergence rates for Euclidean first-passage percolation: Applications to the graph infinity Laplacian
In this paper we prove the first quantitative convergence rates for the graph infinity Laplace equation for length scales at the connectivity threshold. In the graph-based semi-supervised learning community this equation is also known as Lipschitz learning. The graph infinity Laplace equation is characterized by the metric on the underlying space, and convergence rates follow from convergence rates for graph distances. At the connectivity threshold, this problem is related to Euclidean first-passage percolation, which is concerned with the Euclidean distance function d_h(x,y) on a homogeneous Poisson point process on ℝ^d, where admissible paths have step size at most h > 0. Using a suitable regularization of the distance function and subadditivity, we prove that d_{h_s}(0, se_1)/s → σ as s → ∞ almost surely, where σ ≥ 1 is a dimensional constant and h_s ≳ log(s)^{1/d}. A convergence rate is not available due to a lack of approximate superadditivity when h_s → ∞. Instead, we prove convergence rates for the ratio d_h(0, se_1)/d_h(0, 2se_1) → 1/2 when h is frozen and does not depend on s. Combining this with the techniques that we developed in (Bungert, Calder, Roith, IMA Journal of Numerical Analysis, 2022), we show that this notion of ratio convergence is sufficient to establish uniform convergence rates for solutions of the graph infinity Laplace equation at percolation length scales.
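To make the quantities in the abstract concrete, the following is a minimal simulation sketch (not taken from the paper) of the percolation distance d_h: the shortest Euclidean path length through a homogeneous Poisson point process when every step has length at most h. The helper function name d_h, the box geometry, the intensity, and the parameter values (d = 2, s = 50, h = 2) are illustrative assumptions, and the process is sampled only in a finite slab rather than on all of ℝ^d.

```python
# Sketch: estimate d_h(0, s e_1) and the ratio d_h(0, s e_1)/d_h(0, 2s e_1)
# on a simulated homogeneous Poisson point process. All parameters are
# illustrative assumptions, not values from the paper.
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

rng = np.random.default_rng(0)

def d_h(points, src, dst, h):
    """Shortest path length from src to dst using steps of Euclidean length <= h."""
    pts = np.vstack([src, dst, points])                 # index 0 = src, index 1 = dst
    pairs = cKDTree(pts).query_pairs(r=h, output_type="ndarray")
    if len(pairs) == 0:
        return np.inf                                    # no admissible step at all
    w = np.linalg.norm(pts[pairs[:, 0]] - pts[pairs[:, 1]], axis=1)
    n = len(pts)
    graph = csr_matrix((w, (pairs[:, 0], pairs[:, 1])), shape=(n, n))
    return dijkstra(graph, directed=False, indices=0)[1]

# Illustrative parameters (assumptions): dimension, scale, step size, intensity.
d, s, h, intensity = 2, 50.0, 2.0, 1.0

# Sample a unit-intensity Poisson point process in a slab containing 0 and 2s*e_1.
lo = np.full(d, -s); lo[0] = -h
hi = np.full(d, s);  hi[0] = 2 * s + h
n_pts = rng.poisson(intensity * np.prod(hi - lo))
points = rng.uniform(lo, hi, size=(n_pts, d))

origin = np.zeros(d)
e1 = np.eye(d)[0]
num = d_h(points, origin, s * e1, h)        # d_h(0, s e_1)
den = d_h(points, origin, 2 * s * e1, h)    # d_h(0, 2s e_1)
print("d_h(0, s e_1)/s ≈", num / s)
print("ratio d_h(0, s e_1)/d_h(0, 2s e_1) ≈", num / den)  # expected to approach 1/2
```

In this toy setup the first printed quantity mimics the subadditive limit d_{h_s}(0, se_1)/s → σ, and the second mimics the ratio convergence to 1/2 that the paper uses in place of a direct rate; for large s and fixed h the ratio should concentrate near 1/2.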