Publications
- 2023
- HBS Working Paper Series
Debiasing Treatment Effect Estimation for Privacy-Protected Data: A Model Auditing and Calibration Approach
By: Ta-Wei Huang and Eva Ascarza
Abstract
Data-driven targeted interventions have become a powerful tool for organizations to optimize business outcomes
by utilizing individual-level data from experiments. A key element of this process is the estimation
of Conditional Average Treatment Effects (CATE), which enables organizations to effectively identify differences
in customer sensitivities to interventions. However, with the growing importance of data privacy,
organizations are increasingly adopting Local Differential Privacy (LDP)—a privacy-preserving method
that injects calibrated noise into individual records during the data collection process. Despite its privacy protection
benefits, we show that LDP can significantly compromise the predictive accuracy of CATE models
and introduce biases, thereby undermining the effectiveness of targeted interventions. To overcome this
challenge, we introduce a model auditing and calibration approach that improves CATE predictions while
preserving privacy protections. Built on recent advancements in cross-fitting, gradient boosting, and multicalibration,
our method improves model accuracy by iteratively correcting errors in the CATE predictions
without the need for data denoising. As a result, we can improve CATE predictions while maintaining the
same level of privacy protection. Furthermore, we develop a novel local learning with global optimization
approach to mitigate the bias introduced by LDP noise and overfitting during the error correction process.
Our methodology, validated with simulation analyses and two real-world marketing experiments, demonstrates
superior predictive accuracy and targeting performance compared to existing methods and alternative
benchmarks. Our approach empowers organizations to deliver more precise targeted interventions while complying
with privacy regulations and concerns.
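To illustrate the core problem the paper addresses, the sketch below simulates Local Differential Privacy via randomized response on a binary record: each individual's true bit is reported with probability p = e^ε/(1+ε-free flip otherwise), which biases naive aggregate estimates toward 0.5 unless the known flip probability is inverted. This is only a minimal, hypothetical illustration of how LDP noise distorts estimation; it is not the authors' auditing-and-calibration method (which corrects CATE model errors without denoising the data), and the names `randomized_response`, `debias_mean`, and the parameter `eps` are illustrative assumptions.

```python
import math
import random

def randomized_response(value: int, epsilon: float) -> int:
    """Report the true bit with probability p = e^eps / (1 + e^eps); flip it otherwise."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return value if random.random() < p else 1 - value

def debias_mean(noisy_mean: float, epsilon: float) -> float:
    """Invert the known flip probability to recover an unbiased estimate of the true mean."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return (noisy_mean - (1.0 - p)) / (2.0 * p - 1.0)

# Simulate: a population with a 30% true rate, privatized at eps = 1.0.
random.seed(0)
true_rate, n, eps = 0.3, 100_000, 1.0
data = [1 if random.random() < true_rate else 0 for _ in range(n)]
noisy = [randomized_response(x, eps) for x in data]

naive = sum(noisy) / n            # biased toward 0.5 by the injected noise
corrected = debias_mean(naive, eps)  # close to the true 30% rate
```

Running the simulation shows the naive mean drifting well away from 0.3 while the corrected estimate recovers it, mirroring (at the aggregate level) the bias that LDP noise introduces into downstream CATE models.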
Citation
Huang, Ta-Wei, and Eva Ascarza. "Debiasing Treatment Effect Estimation for Privacy-Protected Data: A Model Auditing and Calibration Approach." Harvard Business School Working Paper, No. 24-034, December 2023.