Private Distributed Mean Estimation

06/22/2020
by Lun Wang, et al.

Ever since its proposal, differential privacy has become the gold standard for rigorous privacy protection. Output perturbation is the most widely used differentially private mechanism: it works by adding calibrated noise, drawn from a real-valued distribution, to the output. However, finite computers can only represent real numbers up to a given machine precision. To preserve the guarantees of differential privacy under fixed precision, several pioneering works proposed drawing the noise from discrete distributions instead. However, discrete differentially private mechanisms have not been adopted in composition-intensive tasks such as deep learning because they lack a tight composition theorem such as the moments accountant. In this paper, we prove Rényi differential privacy for the discrete Gaussian mechanism. As a result, we can directly use the analytical moments accountant (<cit.>) to provide tight composition for distributed mean estimation.
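
To make the mechanism concrete, the sketch below illustrates output perturbation with a discrete Gaussian for integer-valued mean estimation. It is a minimal illustration, not the paper's protocol: the function names, the truncation of the noise support, the clipping bound, and the choice of sigma are all assumptions made here for readability, and an exact discrete Gaussian sampler should be used when the truncation error matters for the privacy accounting.

```python
import numpy as np

def sample_discrete_gaussian(sigma, size=1, tail=20, rng=None):
    """Sample from a truncated discrete Gaussian on the integers.

    Probabilities are proportional to exp(-x^2 / (2 * sigma^2)).
    Illustrative only: the support is cut off at +/- tail*sigma, so the
    far tails are approximated rather than sampled exactly.
    """
    rng = np.random.default_rng() if rng is None else rng
    bound = int(np.ceil(tail * sigma))          # hypothetical truncation bound
    support = np.arange(-bound, bound + 1)
    probs = np.exp(-support.astype(float) ** 2 / (2.0 * sigma ** 2))
    probs /= probs.sum()
    return rng.choice(support, size=size, p=probs)

def private_mean(client_values, clip, sigma, rng=None):
    """Output perturbation for mean estimation: clip each integer report,
    sum the clipped values, and add one draw of discrete Gaussian noise."""
    clipped = np.clip(np.asarray(client_values, dtype=int), -clip, clip)
    noisy_sum = clipped.sum() + sample_discrete_gaussian(sigma, rng=rng)[0]
    return noisy_sum / len(client_values)

# Example: 1000 clients reporting integers in [-10, 10]
rng = np.random.default_rng(0)
reports = rng.integers(-10, 11, size=1000)
print(private_mean(reports, clip=10, sigma=32.0, rng=rng))
```

Because both the clipped reports and the noise are integers, the noisy sum stays in the fixed-precision integer domain, which is the point of using a discrete mechanism rather than real-valued Gaussian noise.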
