I am currently a PhD student in the Department of Computer Science at the University of California, Berkeley (previously at Cornell), advised by Nika Haghtalab. I am also fortunate to have worked with Lester Mackey at MSR New England during the summer of 2021, Akshay Krishnamurthy and Cyril Zhang at MSR NYC during the summer of 2022, and Parikshit Gopalan at Apple during the summer of 2023. I am honored to be among the 2023 cohort of the Apple AI/ML fellowship.
Prior to starting grad school, I was a Research Fellow at Microsoft Research India working with Navin Goyal. Before that, I was an undergraduate majoring in Mathematics at the Indian Institute of Science and worked with Bhavana Kanukurthi on cryptography.
My research focuses on designing mathematical frameworks that bring together the theory and practice of machine learning, and on using these frameworks to develop simple algorithms with provable guarantees on real-world data. Importantly, these frameworks come with design principles that can be used to incorporate important desiderata such as robustness, privacy, and fairness into existing pipelines without significant computational or statistical overhead. Some focus areas of my research have been:
I am on the 2023/24 job market. Here is a link to my CV.
Smoothed Analysis of Adaptive Adversaries
with Nika Haghtalab and Tim Roughgarden
Journal of the ACM
FOCS 2021
[Arxiv] [Talk]
Optimal PAC Bounds Without Uniform Convergence
with Ishaq Aden-Ali, Yeshwanth Cherapanamjeri and Nikita Zhivotovskiy
FOCS 2023 (Invited to SICOMP Special Issue)
[Arxiv]
Matrix Discrepancy from Quantum Information
with Sam Hopkins and Prasad Raghavendra
STOC 2022 (Invited to SICOMP Special Issue)
[Arxiv] [Talk]
Oracle-Efficient Online Learning for Beyond Worst-Case Adversaries
with Nika Haghtalab, Yanjun Han and Kunhe Yang
NeurIPS 2022 Oral
[Arxiv]
Distribution Compression in Nearly-linear Time
with Raaz Dwivedi and Lester Mackey
American Statistical Association SCGS Best Student Paper
ICLR 2022
[Arxiv]
On the Performance of Empirical Risk Minimization with Smoothed Data
with Adam Block and Sasha Rakhlin
Under submission
[Arxiv]
Oracle-Efficient Differentially Private Learning with Public Data
with Adam Block, Mark Bun, Rathin Desai and Steven Wu
Under submission
[Arxiv]
Omniprediction for Regression and the Approximate Rank of Convex Functions
with Parikshit Gopalan, Princewill Okoarafor, Prasad Raghavendra and Mihir Singhal
Under submission
[Arxiv]
Smoothed Nash Equilibria: Algorithms and Complexity
with Constantinos Daskalakis, Nika Haghtalab and Noah Golowich
ITCS 2024
[Arxiv]
Adversarial Resilience in Sequential Prediction via Abstention
with Surbhi Goel, Steve Hanneke and Shay Moran
NeurIPS 2023
[Arxiv]
Smoothed Analysis of Sequential Probability Assignment
with Alankrita Bhatt and Nika Haghtalab
NeurIPS 2023 Spotlight
[Arxiv]
Optimal PAC Bounds Without Uniform Convergence
with Ishaq Aden-Ali, Yeshwanth Cherapanamjeri and Nikita Zhivotovskiy
FOCS 2023
[Arxiv]
The One-Inclusion Graph Algorithm is not Always Optimal
with Ishaq Aden-Ali, Yeshwanth Cherapanamjeri and Nikita Zhivotovskiy
COLT 2023
[Arxiv]
Progressive Knowledge Distillation: Building Ensembles for Efficient Inference
with Don Dennis, Anish Sevekari, Kazuhito Koishida and Virginia Smith
NeurIPS 2023
[Arxiv]
Oracle-Efficient Online Learning for Beyond Worst-Case Adversaries
with Nika Haghtalab, Yanjun Han and Kunhe Yang
NeurIPS 2022 Oral
[Arxiv]
Matrix Discrepancy from Quantum Information
with Sam Hopkins and Prasad Raghavendra
STOC 2022 (Invited to SICOMP Special Issue)
[Arxiv] [Talk]
Smoothed Analysis of Adaptive Adversaries
with Nika Haghtalab and Tim Roughgarden
FOCS 2021
[Arxiv] [Talk]
Distribution Compression in Nearly-linear Time
with Raaz Dwivedi and Lester Mackey
American Statistical Association SCGS Best Student Paper
ICLR 2022
[Arxiv]
Smoothed Analysis of Online and Differentially Private Learning
with Nika Haghtalab and Tim Roughgarden
NeurIPS 2020 Spotlight
[Arxiv]
Fractional Pseudorandom Generators from Any Fourier Level
with Eshan Chattopadhyay, Jason Gaitonde, Chin Ho Lee and Shachar Lovett
CCC 2021
[Arxiv]
Effect of Activation Functions on the Training of Overparametrized Neural Nets
with Abhishek Panigrahi and Navin Goyal
ICLR 2020
[Arxiv]
Sampling and Optimization in Convex Sets on Manifolds with Nonnegative Curvature
with Navin Goyal
COLT 2019
[Arxiv]
Non-Gaussian Component Analysis by Entropy Methods
with Navin Goyal
STOC 2019
[Arxiv]
Exponential Weights on the Hypercube in Polynomial Time
with Sudeep Raja Putta
AISTATS 2019
[Arxiv]