Differentially Private Nonparametric Regression Under a Growth Condition

11/24/2021
by Noah Golowich, et al.

Given a real-valued hypothesis class ℋ, we investigate under what conditions there is a differentially private algorithm which learns an optimal hypothesis from ℋ given i.i.d. data. Inspired by recent results for the related setting of binary classification (Alon et al., 2019; Bun et al., 2020), where it was shown that online learnability of a binary class is necessary and sufficient for its private learnability, Jung et al. (2020) showed that in the setting of regression, online learnability of ℋ is necessary for private learnability. Here online learnability of ℋ is characterized by the finiteness of its η-sequential fat shattering dimension, sfat_η(ℋ), for all η > 0. In terms of sufficient conditions for private learnability, Jung et al. (2020) showed that ℋ is privately learnable if lim_{η ↓ 0} sfat_η(ℋ) is finite, which is a fairly restrictive condition. We show that under the relaxed condition liminf_{η ↓ 0} η · sfat_η(ℋ) = 0, ℋ is privately learnable, establishing the first nonparametric private learnability guarantee for classes ℋ with sfat_η(ℋ) diverging as η ↓ 0. Our techniques involve a novel filtering procedure to output stable hypotheses for nonparametric function classes.
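To make the growth condition concrete, here is a minimal sketch in LaTeX contrasting the two sufficient conditions; the polylogarithmic rate in the worked example is an illustrative assumption, not a claim from the abstract.

% Earlier sufficient condition (Jung et al., 2020) vs. the relaxed
% growth condition established in this paper.
\[
  \lim_{\eta \downarrow 0} \operatorname{sfat}_\eta(\mathcal{H}) < \infty
  \qquad \text{vs.} \qquad
  \liminf_{\eta \downarrow 0} \eta \cdot \operatorname{sfat}_\eta(\mathcal{H}) = 0.
\]
% Hypothetical example: if sfat_eta(H) = O(log(1/eta)), then
% eta * sfat_eta(H) = O(eta * log(1/eta)) -> 0 as eta -> 0, so H
% satisfies the relaxed condition even though sfat_eta(H) diverges.

For instance, a class whose dimension grows like sfat_η(ℋ) = O(log(1/η)) fails the finiteness requirement of Jung et al. (2020), yet satisfies the relaxed condition, since η · log(1/η) → 0 as η ↓ 0.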
