What Are the Chances? Explaining the Epsilon Parameter in Differential Privacy

03/01/2023
by Priyanka Nanayakkara, et al.

Differential privacy (DP) is a mathematical privacy notion increasingly deployed across government and industry. With DP, privacy protections are probabilistic: they are bounded by the privacy budget parameter, ϵ. Prior work in health and computational science finds that people struggle to reason about probabilistic risks. Yet, communicating the implications of ϵ to people contributing their data is vital to avoiding privacy theater – presenting meaningless privacy protection as meaningful – and empowering more informed data-sharing decisions. Drawing on best practices in risk communication and usability, we develop three methods to convey probabilistic DP guarantees to end users: two that communicate odds and one offering concrete examples of DP outputs. We quantitatively evaluate these explanation methods in a vignette survey study (n=963) via three metrics: objective risk comprehension, subjective privacy understanding of DP guarantees, and self-efficacy. We find that odds-based explanation methods are more effective than (1) output-based methods and (2) state-of-the-art approaches that gloss over information about ϵ. Further, when offered information about ϵ, respondents are more willing to share their data than when presented with a state-of-the-art DP explanation; this willingness to share is sensitive to ϵ values: as privacy protections weaken, respondents are less likely to share data.
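
The odds framing described in the abstract rests on the standard ε-DP guarantee: adding or removing any one person's data can change the probability of any output by a factor of at most e^ε. The sketch below is only an illustration of that bound under these standard assumptions; it is not the paper's explanation method, and the function names and the randomized-response example are our own.

    import math
    import random

    def epsilon_to_odds_bound(epsilon: float) -> float:
        # Under epsilon-DP, one person's data changes the probability of
        # any output by at most a factor of exp(epsilon).
        return math.exp(epsilon)

    def randomized_response(true_answer: bool, epsilon: float) -> bool:
        # Classic local mechanism satisfying epsilon-DP: report the truth
        # with probability e^eps / (1 + e^eps), otherwise flip the answer.
        p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
        return true_answer if random.random() < p_truth else not true_answer

    # For epsilon = 1: any output is at most ~2.72x more likely under one
    # dataset than under a neighboring one, and randomized response tells
    # the truth with probability ~0.73.
    print(epsilon_to_odds_bound(1.0))             # ~2.718
    print(math.exp(1.0) / (1.0 + math.exp(1.0)))  # ~0.731

For example, weakening privacy to ε = 3 raises the worst-case odds factor to e^3 ≈ 20, which is the kind of shift the odds-based explanations are meant to make concrete for data contributors.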
