Forthcoming | Article | Operations Research

Slowly Varying Regression Under Sparsity

By: Dimitris Bertsimas, Vassilis Digalakis Jr, Michael Lingzhi Li, and Omar Skali Lami
Format: Print | Pages: 17

Abstract

We consider the problem of parameter estimation in slowly varying regression models with sparsity constraints. We formulate the problem as a mixed-integer optimization problem and demonstrate that it can be reformulated exactly as a binary convex optimization problem through a novel exact relaxation. The relaxation utilizes a new equality on Moore-Penrose inverses that convexifies the non-convex objective function while coinciding with the original objective on all feasible binary points. This allows us to solve the problem significantly more efficiently and to provable optimality using a cutting-plane-type algorithm. We develop a highly optimized implementation of this algorithm, which substantially improves upon the asymptotic computational complexity of a straightforward implementation. We further develop a heuristic method that is guaranteed to produce a feasible solution and, as we empirically illustrate, generates high-quality warm-start solutions for the binary optimization problem. We show, on both synthetic and real-world datasets, that the resulting algorithm outperforms competing formulations in comparable times across a variety of metrics, including out-of-sample predictive performance, support recovery accuracy, and false positive rate. The algorithm enables us to train models with tens of thousands of parameters, is robust to noise, and is able to effectively capture the underlying slowly changing support of the data-generating process.
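
For readers who want a concrete sense of the problem class, the sketch below sets up a toy version of slowly varying regression under sparsity in Python: a simple correlation-screening rule picks a shared support of k features, and the per-period coefficients are then obtained by minimizing a least-squares loss plus a temporal smoothness penalty on consecutive coefficient vectors. This is only an illustrative sketch under assumed problem structure; it is not the paper's mixed-integer formulation or its cutting-plane algorithm, and the function name, support-selection rule, and penalty form are assumptions made for exposition.

```python
# Illustrative sketch only: a heuristic for slowly varying sparse regression.
# This is NOT the authors' algorithm; the shared-support screening rule and
# the squared-difference smoothness penalty are assumptions for exposition.
import numpy as np

def fit_slowly_varying_sparse(X, y, k, lam):
    """X: list of (n_t, p) design matrices; y: list of (n_t,) responses.
    Returns a (T, p) array of per-period coefficients supported on k features."""
    T, p = len(X), X[0].shape[1]

    # 1) Heuristic support selection: rank features by aggregate absolute
    #    correlation with the response across all periods, keep the top k.
    scores = sum(np.abs(Xt.T @ yt) for Xt, yt in zip(X, y))
    support = np.argsort(scores)[-k:]

    # 2) On the fixed support, minimize
    #      sum_t ||y_t - X_t b_t||^2 + lam * sum_{t>=2} ||b_t - b_{t-1}||^2,
    #    a quadratic in the stacked (T*k)-dimensional coefficient vector.
    A = np.zeros((T * k, T * k))
    b = np.zeros(T * k)
    for t in range(T):
        Xt = X[t][:, support]
        cur = slice(t * k, (t + 1) * k)
        A[cur, cur] += Xt.T @ Xt
        b[cur] += Xt.T @ y[t]
        if t > 0:  # couple consecutive periods through the smoothness penalty
            prev = slice((t - 1) * k, t * k)
            A[cur, cur] += lam * np.eye(k)
            A[prev, prev] += lam * np.eye(k)
            A[cur, prev] -= lam * np.eye(k)
            A[prev, cur] -= lam * np.eye(k)

    beta_restricted = np.linalg.solve(A + 1e-8 * np.eye(T * k), b).reshape(T, k)
    beta = np.zeros((T, p))
    beta[:, support] = beta_restricted
    return beta
```

A heuristic of this kind plays the role the abstract describes for warm starts: it is cheap to compute and feasible by construction, which is what makes it a useful incumbent for an exact binary optimization method to improve upon.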

Keywords

Mathematical Methods; Analytics and Data Science

Citation

Bertsimas, Dimitris, Vassilis Digalakis Jr, Michael Lingzhi Li, and Omar Skali Lami. "Slowly Varying Regression Under Sparsity." Operations Research (forthcoming). (Pre-published online March 27, 2024.)

About The Author

Michael Lingzhi Li

Technology and Operations Management

More from the Authors

  • Statistical Inference for Heterogeneous Treatment Effects Discovered by Generic Machine Learning in Randomized Experiments
    By: Kosuke Imai and Michael Lingzhi Li. Journal of Business & Economic Statistics, 2025.
  • Neyman Meets Causal Machine Learning: Experimental Evaluation of Individualized Treatment Rules
    By: Michael Lingzhi Li and Kosuke Imai. Journal of Causal Inference, 2024.
  • Learning to Cover: Online Learning and Optimization with Irreversible Decisions
    By: Alexander Jacquillat and Michael Lingzhi Li. Faculty Research, 2024.