Private Convex Optimization via Exponential Mechanism

03/01/2022
by   Sivakanth Gopi, et al.

In this paper, we study private optimization problems for non-smooth convex functions F(x)=𝔼_i f_i(x) on ℝ^d. We show that modifying the exponential mechanism by adding an ℓ_2^2 regularizer to F(x) and sampling from π(x) ∝ exp(-k(F(x) + μ‖x‖_2^2/2)) recovers both the known optimal empirical risk and population loss under (ϵ,δ)-DP. Furthermore, we show how to implement this mechanism for DP-SCO using O(n min(d, n)) queries to f_i(x), where n is the number of samples/users and d is the ambient dimension. We also give a (nearly) matching lower bound Ω(n min(d, n)) on the number of evaluation queries. Our results utilize the following tools, which are of independent interest: (1) We prove Gaussian Differential Privacy (GDP) of the exponential mechanism if the loss function is strongly convex and the perturbation is Lipschitz. Our privacy bound is optimal, as it includes the privacy of the Gaussian mechanism as a special case, and it is proved using the isoperimetric inequality for strongly log-concave measures. (2) We show how to sample from exp(-F(x) - μ‖x‖_2^2/2) for G-Lipschitz F with η error in total variation (TV) distance using O((G^2/μ) log^2(d/η)) unbiased queries to F(x). This is the first sampler whose query complexity has polylogarithmic dependence on both the dimension d and the accuracy η.
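To make the mechanism concrete, here is a minimal sketch of sampling from the regularized distribution π(x) ∝ exp(-k(F(x) + μ‖x‖_2^2/2)) for a Lipschitz, non-smooth convex loss. This is only an illustration: it uses unadjusted Langevin dynamics with finite-difference gradients as a stand-in for the paper's value-oracle sampler, and the step size, iteration count, and toy loss `F` are assumptions, not the authors' construction; it does not achieve the stated O((G^2/μ) log^2(d/η)) query complexity or any formal privacy guarantee.

```python
import numpy as np

def sample_regularized_exp_mechanism(F, d, k=1.0, mu=1.0,
                                     n_steps=2000, step=1e-2, seed=0):
    """Approximately sample x ~ pi(x) ∝ exp(-k (F(x) + mu * ||x||^2 / 2)).

    Illustrative only: uses unadjusted Langevin dynamics with a
    finite-difference gradient of F, not the paper's sampler.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(d)
    eps = 1e-4  # finite-difference width for the (sub)gradient of F
    for _ in range(n_steps):
        grad_F = np.zeros(d)
        for j in range(d):
            e = np.zeros(d)
            e[j] = eps
            grad_F[j] = (F(x + e) - F(x - e)) / (2 * eps)
        # Gradient of the full potential k * (F(x) + mu * ||x||^2 / 2).
        grad = k * (grad_F + mu * x)
        # Langevin update: gradient step plus Gaussian noise.
        x = x - step * grad + np.sqrt(2 * step) * rng.standard_normal(d)
    return x

# Toy non-smooth 1-Lipschitz convex loss (an assumption for illustration).
F = lambda x: np.abs(x).sum()
x = sample_regularized_exp_mechanism(F, d=3, k=2.0, mu=1.0)
```

Because μ > 0 makes the potential strongly convex, the chain mixes toward a well-concentrated log-concave distribution centered near the regularized minimizer; with F = ‖x‖_1 that minimizer is the origin.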
