Publications
In my field, authors are ordered alphabetically by last name.
1. Isotropic Noise in Stochastic and Quantum Convex Optimization
with Annie Marsden, Aaron Sidford, and Chenyi Zhang
NeurIPS 2025
2. Solving Zero-Sum Games with Fewer Matrix-Vector Products
with Ishani Karmarkar and Aaron Sidford
FOCS 2025
[arXiv]
Overview: For two decades, computing an \(\epsilon\)-approximate Nash Equilibrium of a zero-sum game in the first-order oracle access model has been subject to a foundational \(\tilde{O}(\epsilon^{-1})\) query complexity barrier. We introduce the first deterministic algorithm to break it. Our work focuses on the fundamental setting where an algorithm accesses the payoff matrix \(A\) via matrix-vector products \((Ax, A^\top y)\), which is equivalent to asking for the players’ expected payoffs. While prominent algorithms such as accelerated gradient descent, optimistic mirror descent, and mirror prox match this long-standing barrier, our method achieves an improved \(\tilde{O}(\epsilon^{-8/9})\) query complexity. This result stems from a new and general framework that also yields improved deterministic complexities for other core machine learning problems, including training hard-margin SVMs and solving linear regression.
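To make the access model concrete, here is a minimal sketch (not the paper's algorithm) of the matrix-vector oracle: the solver never reads \(A\) entrywise, and may only request \(Ax\) and \(A^\top y\), which are exactly the players' expected payoffs under mixed strategies \(x\) and \(y\). The class and method names are illustrative.

```python
import numpy as np

class MatVecOracle:
    """Hypothetical wrapper giving matrix-vector access to a payoff matrix,
    counting how many oracle queries an algorithm makes."""

    def __init__(self, A):
        self._A = np.asarray(A, dtype=float)  # hidden from the algorithm
        self.queries = 0

    def Ax(self, x):
        # Row player's expected payoff per pure strategy, under column strategy x.
        self.queries += 1
        return self._A @ x

    def ATy(self, y):
        # Column player's expected payoff per pure strategy, under row strategy y.
        self.queries += 1
        return self._A.T @ y

# Usage: matching pennies; uniform strategies are the Nash Equilibrium,
# so both expected-payoff vectors vanish.
oracle = MatVecOracle([[1, -1], [-1, 1]])
x = np.array([0.5, 0.5])
payoffs = oracle.Ax(x)
```

Query complexity in this model counts calls to `Ax` and `ATy`, regardless of the dimensions of \(A\).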
3. Extracting Dual Solutions via Primal Optimizers
with Yair Carmon, Arun Jambulapati, and Aaron Sidford
ITCS 2025
[arXiv]
Overview: For several core optimization and machine learning problems, a gap exists: the best-known algorithms for the min-max and max-min formulations have different complexities. We close this gap with a general framework that provides efficient, bidirectional reductions between min-max and max-min optimization. Our framework yields new state-of-the-art algorithms for bilinear matrix games and the dual of CVaR distributionally robust optimization (DRO). To demonstrate its versatility, we also apply it to recover the optimal query complexity for computing an approximate stationary point of a convex function.
4. The Burer-Monteiro SDP method can fail even above the Barvinok-Pataki bound
with Vaidehi Srinivas and Aravindan Vijayaraghavan
NeurIPS 2022
[arXiv] [recorded mini talk]
Overview: Semidefinite programs (SDPs) constitute a powerful class of optimization problems with broad applications in machine learning and operations research. We analyze the optimization landscape of the non-convex Burer-Monteiro method, a popular approach for solving large-scale SDPs in practice. Specifically, we construct a family of instances with high-rank spurious local minima, thereby justifying the use of beyond worst-case paradigms such as smoothed analysis to obtain global convergence guarantees. A more detailed overview is available here.
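For intuition, here is a minimal sketch of the Burer-Monteiro substitution on a toy instance (a max-cut-style relaxation on two vertices; this is not the paper's hard construction, where such local search can get stuck): the PSD variable \(X\) is replaced by \(UU^\top\) with \(U \in \mathbb{R}^{n \times p}\), and projected gradient descent runs over \(U\) instead of solving the convex SDP.

```python
import numpy as np

def burer_monteiro(C, p, steps=500, lr=0.01, seed=0):
    """Locally minimize <C, U U^T> over U in R^{n x p} with unit-norm rows
    (i.e. diag(U U^T) = 1), via projected gradient descent. Illustrative only."""
    n = C.shape[0]
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((n, p))
    U /= np.linalg.norm(U, axis=1, keepdims=True)   # start feasible
    for _ in range(steps):
        grad = 2 * C @ U                            # gradient of <C, U U^T> in U
        U -= lr * grad
        U /= np.linalg.norm(U, axis=1, keepdims=True)  # project rows to the sphere
    return U @ U.T                                  # candidate SDP solution

# Toy instance: two vertices joined by an edge. Minimizing <C, X> with
# C the adjacency matrix drives X[0,1] toward -1, i.e. the vertices are cut.
C = np.array([[0.0, 1.0], [1.0, 0.0]])
X = burer_monteiro(C, p=2)
```

The Barvinok-Pataki bound guarantees that some rank-\(p\) optimum exists once \(p \gtrsim \sqrt{2m}\) for \(m\) constraints; the paper's point is that local search over \(U\) can nonetheless converge to spurious local minima even above that threshold.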