Optimal-Degree Polynomial Approximations for Exponentials and Gaussian Kernel Density Estimation

05/12/2022
by Amol Aggarwal, et al.

For any real numbers B ≥ 1 and δ ∈ (0, 1) and any function f: [0, B] → ℝ, let d_{B,δ}(f) ∈ ℤ_{>0} denote the minimum degree of a polynomial p(x) satisfying sup_{x ∈ [0, B]} |p(x) − f(x)| < δ. In this paper, we provide precise asymptotics for d_{B,δ}(e^{-x}) and d_{B,δ}(e^{x}) in terms of both B and δ, improving both the previously known upper bounds and lower bounds. In particular, we show

    d_{B,δ}(e^{-x}) = Θ( max{ √(B log(δ^{-1})), log(δ^{-1}) / log(B^{-1} log(δ^{-1})) } ),
    d_{B,δ}(e^{x})  = Θ( max{ B, log(δ^{-1}) / log(B^{-1} log(δ^{-1})) } ).

Polynomial approximations for e^{-x} and e^{x} have applications to the design of algorithms for many problems, and our degree bounds show both the power and the limitations of these algorithms. We focus in particular on the Batch Gaussian Kernel Density Estimation problem for n sample points in Θ(log n) dimensions with error δ = n^{-Θ(1)}. We show that the achievable running time depends on B, the square of the diameter of the point set, with a transition at B = Θ(log n) mirroring the corresponding transition in d_{B,δ}(e^{-x}):

- When B = o(log n), we give the first algorithm running in time n^{1 + o(1)}.
- When B = κ log n for a small constant κ > 0, we give an algorithm running in time n^{1 + O(log log(κ^{-1}) / log(κ^{-1}))}. The log log(κ^{-1}) / log(κ^{-1}) term in the exponent comes from analyzing the behavior of the leading constant in our computation of d_{B,δ}(e^{-x}).
- When B = ω(log n), we show that time n^{2 − o(1)} is necessary assuming SETH.
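As an illustrative sanity check on the quantity d_{B,δ}(e^{-x}), the following sketch (not from the paper; the helper name `min_degree` and all parameters are our own) searches for the smallest degree at which a Chebyshev interpolant of a function on [0, B] achieves uniform error below δ, with the sup-norm estimated on a fine grid. Chebyshev interpolation is near-optimal rather than optimal, so the degrees it reports only upper-bound the true minimum degree.

```python
import numpy as np

def min_degree(f, B, delta, max_deg=200):
    """Smallest degree d whose degree-d Chebyshev interpolant of f on [0, B]
    has sup-norm error below delta, estimated on a dense grid."""
    xs = np.linspace(0.0, B, 4001)  # grid used to approximate the sup norm
    fx = f(xs)
    for d in range(1, max_deg + 1):
        # Interpolate f at the d+1 Chebyshev points of [0, B].
        p = np.polynomial.Chebyshev.interpolate(f, d, domain=[0.0, B])
        if np.max(np.abs(p(xs) - fx)) < delta:
            return d
    return None  # no degree up to max_deg suffices

# Example: minimal interpolation degree for e^{-x} on [0, 4] with delta = 1e-6.
d = min_degree(lambda x: np.exp(-x), 4.0, 1e-6)
```

Varying B and δ in this experiment, the reported degrees grow on the order of max{√(B log(δ^{-1})), log(δ^{-1}) / log(B^{-1} log(δ^{-1}))}, in line with the asymptotics above.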

