Learn to Forget: User-Level Memorization Elimination in Federated Learning

03/24/2020

by Yang Liu, et al.

Federated learning is a decentralized machine learning technique that has attracted widespread attention in both the research community and the real-world market. However, current privacy-preserving federated learning schemes only provide a secure way for users to contribute their private data; they offer no way to withdraw that contribution from the model updates. Such an irreversible setting potentially violates data-protection regulations and increases the risk of data extraction. To resolve this problem, this paper introduces a novel concept for federated learning, called memorization elimination. Based on this concept, we propose a federated learning framework that allows a user to eliminate the memorization of its private data in the trained model. Specifically, each user in the framework is deployed with a trainable dummy gradient generator. After a number of training steps, the generator can produce dummy gradients that stimulate the neurons of a machine learning model to eliminate the memorization of the specific data. We also prove that the additional memorization elimination service neither breaks the common procedure of federated learning nor lowers its security.
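The paper does not give implementation details in this abstract, but the core idea of submitting "dummy gradients" that counteract a user's earlier contribution can be illustrated with a toy sketch. The following minimal Python example (all names, data, and the simple anti-gradient rule are this sketch's assumptions, not the authors' actual generator) trains a tiny logistic model on one private example, then applies dummy gradients pointing opposite to that example's gradient through the ordinary update path, raising the loss on the private example and thus eroding its memorization:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad(w, x, y):
    # Gradient of the logistic loss for a single example.
    return (sigmoid(x @ w) - y) * x

def loss(w, x, y):
    p = sigmoid(x @ w)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

# Toy "global model" and one user's private example (hypothetical data).
w = rng.normal(size=3)
x_priv, y_priv = np.array([1.0, 2.0, -1.0]), 1.0

# Phase 1: ordinary training memorizes the private example
# (loss on it becomes small).
for _ in range(50):
    w -= 0.5 * grad(w, x_priv, y_priv)
memorized_loss = loss(w, x_priv, y_priv)

# Phase 2: a stand-in for the dummy gradient generator -- it emits
# gradients opposite to the private example's gradient, and the
# server applies them like any normal federated update.
for _ in range(5):
    dummy = -grad(w, x_priv, y_priv)
    w -= 0.5 * dummy

eliminated_loss = loss(w, x_priv, y_priv)
print(memorized_loss, eliminated_loss)
```

After the dummy updates, the loss on the private example is higher than right after training, which is the abstract's notion of eliminating its memorization; the real framework replaces this hand-coded rule with a trainable generator so the server need not see the private data.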
