Publications
  • Article
  • Proceedings of the International Conference on Machine Learning (ICML)

Mitigating Bias in Adaptive Data Gathering via Differential Privacy

By: Seth Neel and Aaron Leon Roth

Abstract

Data that is gathered adaptively, for example via bandit algorithms, exhibits bias. This is true both when gathering simple numeric-valued data (the empirical means tracked by stochastic bandit algorithms are biased downwards) and when gathering more complicated data (running hypothesis tests on complex data gathered via contextual bandit algorithms leads to false discovery). In this paper, we show that this problem is mitigated if the data collection procedure is differentially private. This lets us both bound the bias of simple numeric-valued quantities (like the empirical means of stochastic bandit algorithms) and correct the p-values of hypothesis tests run on the adaptively gathered data. Moreover, there exist differentially private bandit algorithms with near-optimal regret bounds: we apply existing theorems in the simple stochastic case and give a new analysis for linear contextual bandits. We complement our theoretical results with experiments validating our theory.
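
The downward bias described in the abstract, and its mitigation by noisy (private) decision-making, can be illustrated with a small simulation. The sketch below is not the paper's algorithm: it runs a plain greedy stochastic bandit, measures how far the final per-arm empirical means fall below the true means, and then repeats the run with arm selection driven by Laplace-perturbed means as a rough proxy for a differentially private selection rule. All parameters (number of arms, horizon, noise scale) are arbitrary choices for illustration.

import numpy as np

rng = np.random.default_rng(0)

K = 5        # number of arms (illustrative choice)
T = 200      # rounds per run
RUNS = 2000  # Monte Carlo repetitions
TRUE_MEANS = np.linspace(0.3, 0.7, K)  # arbitrary true arm means

def final_bias(noise_scale):
    """Run one greedy bandit and return (empirical mean - true mean) per arm.
    If noise_scale > 0, arm selection uses Laplace-perturbed empirical means,
    a crude stand-in for a differentially private selection mechanism."""
    counts = np.zeros(K)
    sums = np.zeros(K)
    for k in range(K):                     # pull each arm once to initialize
        sums[k] += rng.normal(TRUE_MEANS[k], 1.0)
        counts[k] += 1
    for _ in range(T - K):
        means = sums / counts
        if noise_scale > 0:
            means = means + rng.laplace(0.0, noise_scale, size=K)
        k = int(np.argmax(means))          # greedy choice on the (noisy) means
        sums[k] += rng.normal(TRUE_MEANS[k], 1.0)
        counts[k] += 1
    return sums / counts - TRUE_MEANS

for label, scale in [("non-private greedy", 0.0), ("Laplace-noised greedy", 0.5)]:
    bias = np.mean([final_bias(scale) for _ in range(RUNS)], axis=0)
    print(f"{label:>22}: average per-arm bias = {np.round(bias, 3)}")

In simulations of this kind, the non-private greedy empirical means tend to fall below the true means, while the noisy selection rule shrinks that gap; the paper makes this qualitative effect precise through formal differential-privacy guarantees.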

Keywords

Bandit Algorithms; Bias; Analytics and Data Science; Mathematical Methods; Theory

Citation

Neel, Seth, and Aaron Leon Roth. "Mitigating Bias in Adaptive Data Gathering via Differential Privacy." Proceedings of the 35th International Conference on Machine Learning (ICML) (2018).

About the Author

Seth Neel

Technology and Operations Management

More from the Authors

  • Data Privacy in Practice at LinkedIn (September 2022, Revised October 2022; Faculty Research). By: Iavor Bojinov, Marco Iansiti and Seth Neel
  • Adaptive Machine Unlearning (Advances in Neural Information Processing Systems (NeurIPS)). By: Varun Gupta, Christopher Jung, Seth Neel, Aaron Roth, Saeed Sharifi-Malvajerdi and Chris Waites
  • Descent-to-Delete: Gradient-Based Methods for Machine Unlearning (March 2021; Faculty Research). By: Seth Neel, Aaron Leon Roth and Saeed Sharifi-Malvajerdi