Personalization and Optimization of Decision Parameters via Heterogenous Causal Effects

01/29/2019
by Ye Tu, et al.

Randomized experimentation (also known as A/B testing or bucket testing) is widely used in the internet industry to measure the effect of a new treatment. Often, the decision following such a test is to ramp the variant that performed best over the entire population. However, the effect of a given treatment can vary across experimental units, and ramping a single variant to the whole population can be quite suboptimal. In this work, we propose a method that automatically identifies (using causal trees) and exploits (using stochastic optimization) the heterogeneity of a treatment's effect across experimental units. We illustrate the approach with two real-life examples: one on serving notifications and the other on modulating ads density in the feed. In both examples, we demonstrate the benefits of the approach through offline simulation and online experimentation. At the time of writing, the method has been fully deployed in LinkedIn's ads and notifications systems.
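To make the identify-then-exploit idea concrete, here is a minimal sketch, not the authors' implementation: it uses a transformed-outcome regression tree as a stand-in for a causal tree to estimate per-cohort treatment effects, and a simple thresholding policy as a stand-in for the paper's stochastic optimization step. The synthetic data, the 50/50 randomization propensity, and all variable names are assumptions made purely for illustration.

```python
# Illustrative sketch only: estimate heterogeneous treatment effects with a
# shallow tree fit on an IPW-transformed outcome, then choose the better
# variant per cohort. All data here is synthetic (an assumption).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 20_000
X = rng.normal(size=(n, 3))                  # unit-level covariates
T = rng.integers(0, 2, size=n)               # randomized assignment, p = 0.5
tau = 0.5 * X[:, 0] - 0.3 * (X[:, 1] > 0)    # heterogeneous true effect
Y = X[:, 2] + tau * T + rng.normal(size=n)   # observed outcome

# Transformed outcome: under randomization with known propensity p,
# E[Y_star | X] equals the conditional average treatment effect.
p = 0.5
Y_star = Y * (T / p - (1 - T) / (1 - p))

# A shallow tree yields interpretable cohorts with per-cohort effect estimates.
tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=1000)
tree.fit(X, Y_star)

# Exploit the heterogeneity: serve the treatment only to cohorts whose
# estimated effect is positive (a simplified stand-in for the optimization).
cate_hat = tree.predict(X)
policy = (cate_hat > 0).astype(int)
print("fraction of units assigned to treatment:", policy.mean())
```

In the paper's setting, the per-cohort effect estimates would instead feed a constrained stochastic optimization that trades off multiple metrics (for example, sessions versus notification volume, or revenue versus engagement); the tree simply supplies the cohorts and their estimated effects.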
