The Influence of Dropout on Membership Inference in Differentially Private Models

03/16/2021
by   Erick Galinkin, et al.

Differentially private models seek to protect the privacy of the data they are trained on, making differential privacy an important component of model security and privacy. At the same time, data scientists and machine learning engineers use uncertainty quantification methods to ensure models are as useful and actionable as possible. We explore the tension between uncertainty quantification via dropout and privacy by conducting membership inference attacks against models with and without differential privacy. We find that a large dropout rate slightly increases a model's risk of succumbing to membership inference attacks in all cases, including in differentially private models.
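The setting can be made concrete with a short sketch. The snippet below is a minimal, hypothetical PyTorch illustration, not the paper's actual code: a small classifier with a dropout layer (the rate p is the knob the abstract calls "large dropout"), plus a generic confidence-threshold membership inference attack in the style of Yeom et al., where the attacker flags an example as a training-set member when the model's top softmax probability exceeds a threshold. All names, layer sizes, and hyperparameters here are assumptions for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical small classifier with a dropout layer; the dropout
# rate p is the quantity varied in the paper's experiments.
class DropoutMLP(nn.Module):
    def __init__(self, in_dim=784, hidden=256, n_classes=10, p=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Dropout(p),  # dropout used for uncertainty quantification
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def confidence_threshold_mia(model, x, threshold=0.9):
    """Generic confidence-threshold membership inference: flag an
    example as a training member when the model's top softmax
    probability exceeds `threshold`."""
    model.eval()  # dropout disabled at attack time
    probs = torch.softmax(model(x), dim=-1)
    return probs.max(dim=-1).values > threshold

# Usage sketch: attack guesses for a batch of candidate examples.
model = DropoutMLP(p=0.75)   # a "large" dropout rate
x = torch.randn(32, 784)     # placeholder data
is_member_guess = confidence_threshold_mia(model, x)
```

At attack time dropout is disabled via model.eval(), matching the usual threat model in which the adversary only queries the deployed model; the dropout rate influences the attack only through how it shaped the trained weights.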

