Publications
  • Article
  • Proceedings of the International Conference on Machine Learning (ICML)

Oracle Efficient Private Non-Convex Optimization

By: Seth Neel, Aaron Leon Roth, Giuseppe Vietri and Zhiwei Steven Wu

Abstract

One of the most effective algorithms for differentially private learning and optimization is objective perturbation. This technique augments a given optimization problem (e.g., one derived from an ERM problem) with a random linear term, and then exactly solves it. However, to date, analyses of this approach crucially rely on the convexity and smoothness of the objective function, limiting its generality. We give two algorithms that extend this approach substantially. The first algorithm requires nothing except boundedness of the loss function, and operates over a discrete domain. Its privacy and accuracy guarantees hold even without assuming convexity. This gives an oracle-efficient optimization algorithm over arbitrary discrete domains that is comparable in its generality to the exponential mechanism. The second algorithm operates over a continuous domain and requires only that the loss function be bounded and Lipschitz in its continuous parameter. Its privacy analysis does not require convexity. Its accuracy analysis does require convexity, but does not require second-order conditions like smoothness. Even without convexity, this algorithm can be used generically as an oracle-efficient optimization algorithm, with accuracy evaluated empirically. We complement our theoretical results with an empirical evaluation of the non-convex case, in which we use an integer program solver as our optimization oracle. We find that for the problem of learning linear classifiers, directly optimizing for 0/1 loss using our approach can outperform the more standard approach of privately optimizing a convex surrogate loss function on the Adult dataset.
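For intuition, here is a minimal sketch of the objective-perturbation template the abstract describes: draw a random linear term, add it to the loss, and hand the perturbed problem to an optimization oracle. The function names, the Gaussian noise choice, and the `noise_scale` value below are illustrative assumptions, not the paper's exact mechanism, which calibrates the perturbation to the privacy budget.

```python
import numpy as np

def objective_perturbation(loss, oracle, dim, noise_scale, rng=None):
    """Sketch of objective perturbation: augment the loss with a random
    linear term and pass the perturbed problem to an optimization oracle.

    loss        -- callable theta -> empirical loss on the private data
    oracle      -- callable objective -> argmin over the parameter domain
                   (e.g., an integer program solver for a discrete domain)
    noise_scale -- magnitude of the linear term; in the paper's algorithms
                   this is calibrated to the privacy budget (assumed here)
    """
    rng = np.random.default_rng() if rng is None else rng
    b = rng.normal(scale=noise_scale, size=dim)   # random linear perturbation

    def perturbed(theta):
        return loss(theta) + b @ np.asarray(theta, dtype=float)

    return oracle(perturbed)                      # oracle solves it exactly


# Toy usage: a brute-force oracle over a tiny discrete domain, with the
# 0/1 loss of a linear classifier (both purely illustrative).
X = np.array([[1.0, 2.0], [2.0, -1.0], [-1.0, 1.0]])
y = np.array([1, -1, 1])
domain = [np.array(t) for t in [(1, 0), (0, 1), (1, 1), (-1, 1)]]

zero_one_loss = lambda theta: np.mean(np.sign(X @ theta) != y)
brute_force_oracle = lambda f: min(domain, key=f)
theta_hat = objective_perturbation(zero_one_loss, brute_force_oracle,
                                   dim=2, noise_scale=0.1)
```

Here exact brute force stands in for the oracle; the point of the paper is that the same template retains its guarantees when the oracle is a heuristic solver and the loss is non-convex.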

Keywords

Machine Learning; Algorithms; Objective Perturbation; Mathematical Methods

Citation

Neel, Seth, Aaron Leon Roth, Giuseppe Vietri, and Zhiwei Steven Wu. "Oracle Efficient Private Non-Convex Optimization." Proceedings of the 37th International Conference on Machine Learning (ICML) (2020).

About The Author

Seth Neel

Technology and Operations Management

More from the Authors

  • September 2022 (Revised October 2022), Faculty Research
    Data Privacy in Practice at LinkedIn
    By: Iavor Bojinov, Marco Iansiti and Seth Neel
  • Advances in Neural Information Processing Systems (NeurIPS)
    Adaptive Machine Unlearning
    By: Varun Gupta, Christopher Jung, Seth Neel, Aaron Roth, Saeed Sharifi-Malvajerdi and Chris Waites
  • March 2021, Faculty Research
    Descent-to-Delete: Gradient-Based Methods for Machine Unlearning
    By: Seth Neel, Aaron Leon Roth and Saeed Sharifi-Malvajerdi