Wireless Federated Learning with Local Differential Privacy

02/12/2020
by Mohamed Seif, et al.

In this paper, we study the problem of federated learning (FL) over a wireless channel, modeled as a Gaussian multiple access channel (MAC), subject to local differential privacy (LDP) constraints. We show that the superposition nature of the wireless channel provides a dual benefit: bandwidth-efficient gradient aggregation together with strong LDP guarantees for the users. We propose a private wireless gradient aggregation scheme and show that, when aggregating gradients from K users, the privacy leakage per user scales as O(1/√K), whereas under orthogonal transmission the leakage remains constant. We also analyze the convergence rate of the proposed private FL aggregation algorithm and study the tradeoffs among wireless resources, convergence, and privacy.
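To make the intuition behind the O(1/√K) scaling concrete, below is a minimal simulation sketch (not the paper's exact scheme): each of K users perturbs its local gradient with Gaussian noise before all users transmit simultaneously over a Gaussian MAC, so the server only sees the superposition. The noise variances add across users, so the effective noise shielding any single user's gradient grows like √K while the aggregation error per user stays modest. All parameter values (K, gradient dimension, noise levels) are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of over-the-air private gradient aggregation.
# Assumed illustrative parameters (not taken from the paper):
K = 20               # number of users
d = 10               # gradient dimension
sigma_local = 1.0    # per-user artificial (privacy) noise std
sigma_channel = 0.5  # receiver/channel noise std

rng = np.random.default_rng(0)
local_grads = rng.normal(size=(K, d))  # stand-in local gradients

# Each user transmits its gradient plus local Gaussian perturbation.
transmit = local_grads + sigma_local * rng.normal(size=(K, d))

# Gaussian MAC: the receiver observes the superposition plus channel noise.
received = transmit.sum(axis=0) + sigma_channel * rng.normal(size=d)

# Server estimate of the average gradient.
avg_estimate = received / K

# Effective noise std protecting any single user's gradient inside the sum
# grows as sqrt(K) * sigma_local, so per-user privacy leakage shrinks
# roughly as 1/sqrt(K) relative to orthogonal (per-user) transmission,
# where each gradient is only masked by its own sigma_local.
effective_noise_std = np.sqrt(K * sigma_local**2 + sigma_channel**2)
print("effective noise std shielding each user:", effective_noise_std)
print("aggregation MSE:",
      np.mean((avg_estimate - local_grads.mean(axis=0)) ** 2))
```

Running this sketch for increasing K shows the effective shielding noise growing with √K while the averaged-gradient error stays controlled, which is the qualitative tradeoff the paper formalizes.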
