Differential Privacy in Personalized Pricing with Nonparametric Demand Models
In recent decades, advances in information technology and the abundance of personal data have facilitated the application of algorithmic personalized pricing. However, this raises growing concerns about potential privacy violations due to adversarial attacks. To address the privacy issue, this paper studies a dynamic personalized pricing problem with unknown nonparametric demand models under data privacy protection. Two concepts of data privacy, both widely applied in practice, are introduced: central differential privacy (CDP) and local differential privacy (LDP), the latter of which is proved to be a stronger notion of privacy than CDP in many cases. We develop two algorithms that make pricing decisions and learn the unknown demand on the fly, while satisfying the CDP and LDP guarantees respectively. In particular, for the algorithm with the CDP guarantee, the regret is proved to be at most Õ(T^((d+2)/(d+4)) + ε^(-1) T^(d/(d+4))). Here, T denotes the length of the time horizon, d is the dimension of the personalized information vector, and the key parameter ε > 0 measures the strength of privacy (smaller ε indicates stronger privacy protection). On the other hand, for the algorithm with the LDP guarantee, the regret is proved to be at most Õ(ε^(-2/(d+2)) T^((d+1)/(d+2))), which is near-optimal as we prove a lower bound of Ω(ε^(-2/(d+2)) T^((d+1)/(d+2))) for any algorithm with an LDP guarantee.
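For readability, the three regret bounds quoted above can be restated in display form; the following LaTeX sketch simply transcribes them under the abstract's own notation (T: horizon length, d: dimension of the personalized information vector, ε: privacy parameter), and introduces no new results.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Regret bounds stated in the abstract, in display form.
% Smaller \varepsilon corresponds to stronger privacy protection.
\begin{align*}
  \text{CDP algorithm:} \quad
    & \operatorname{Regret}(T) \le \tilde{O}\!\left(T^{\frac{d+2}{d+4}} + \varepsilon^{-1} T^{\frac{d}{d+4}}\right), \\
  \text{LDP algorithm:} \quad
    & \operatorname{Regret}(T) \le \tilde{O}\!\left(\varepsilon^{-\frac{2}{d+2}} T^{\frac{d+1}{d+2}}\right), \\
  \text{LDP lower bound (any LDP algorithm):} \quad
    & \operatorname{Regret}(T) \ge \Omega\!\left(\varepsilon^{-\frac{2}{d+2}} T^{\frac{d+1}{d+2}}\right).
\end{align*}
\end{document}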