Robust Estimation of Discrete Distributions under Local Differential Privacy

02/14/2022
by Julien Chhor, et al.

Although robust learning and local differential privacy are both widely studied fields of research, combining the two settings has only recently begun to be explored. We consider the problem of estimating a discrete distribution in total variation from $n$ contaminated data batches under a local differential privacy constraint. A fraction $1-\epsilon$ of the batches contain $k$ i.i.d. samples drawn from a discrete distribution $p$ over $d$ elements. To protect the users' privacy, each of the samples is privatized using an $\alpha$-locally differentially private mechanism. The remaining $\epsilon n$ batches are an adversarial contamination. The minimax rate of estimation under contamination alone, with no privacy, is known to be $\epsilon/\sqrt{k} + \sqrt{d/(kn)}$, up to a $\sqrt{\log(1/\epsilon)}$ factor. Under the privacy constraint alone, the minimax rate of estimation is $\sqrt{d^2/(\alpha^2 kn)}$. We show that combining the two constraints leads to a minimax estimation rate of $\epsilon\sqrt{d/(\alpha^2 k)} + \sqrt{d^2/(\alpha^2 kn)}$, up to a $\sqrt{\log(1/\epsilon)}$ factor, which is larger than the sum of the two separate rates. We provide a polynomial-time algorithm achieving this bound, as well as a matching information-theoretic lower bound.
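The abstract does not specify which $\alpha$-locally differentially private mechanism is used, so the sketch below illustrates the privacy side of the setting with $d$-ary randomized response, a standard LDP channel for discrete data, followed by the linear inversion of that channel to get an unbiased estimate of $p$. The function names and parameter choices are ours, for illustration only; this handles only uncontaminated data, not the adversarial batches that the paper's algorithm is designed to withstand.

```python
import numpy as np

def randomized_response(x, d, alpha, rng):
    """d-ary randomized response: report the true symbol x in {0,...,d-1}
    with probability e^alpha / (e^alpha + d - 1), else a uniform other
    symbol. Any two inputs have likelihood ratio at most e^alpha, so the
    mechanism is alpha-locally differentially private."""
    if rng.random() < np.exp(alpha) / (np.exp(alpha) + d - 1):
        return x
    z = rng.integers(d - 1)        # uniform over the d - 1 other symbols
    return z if z < x else z + 1   # skip over the true symbol x

def estimate_p(reports, d, alpha):
    """Unbiased estimate of p from privatized reports. The channel maps
    p_j to (p_j * (e^a - 1) + 1) / (e^a + d - 1); invert it linearly."""
    e_a = np.exp(alpha)
    freq = np.bincount(reports, minlength=d) / len(reports)
    return (freq * (e_a + d - 1) - 1) / (e_a - 1)

# Toy run on uncontaminated samples from a skewed p over d = 5 symbols.
rng = np.random.default_rng(0)
d, alpha, n_samples = 5, 1.0, 50_000
p = np.array([0.4, 0.3, 0.15, 0.1, 0.05])
data = rng.choice(d, size=n_samples, p=p)
reports = np.array([randomized_response(x, d, alpha, rng) for x in data])
print(np.round(estimate_p(reports, d, alpha), 3))
```

To see why the combined rate exceeds the contamination-only rate, compare the first terms at, say, $d = 100$, $k = 100$, $\alpha = 1$, $\epsilon = 0.05$: without privacy the contamination term is $\epsilon/\sqrt{k} = 0.005$, while under privacy it becomes $\epsilon\sqrt{d/(\alpha^2 k)} = 0.05$, a blow-up by the factor $\sqrt{d}/\alpha$.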
