I'm a PhD candidate in theoretical physics at Harvard, where I'm very fortunate to be advised by Cengiz Pehlevan. My research is on understanding algorithms that learn from data through the lens of statistical physics and random matrix theory. This spans neural networks, nonparametric Bayesian methods, and kernels. Lately, we have been thinking a lot about scaling laws. Prior to this, I had the pleasure of collaborating with Andy Strominger and colleagues on topics adjacent to string theory and quantum field theory. My work is funded by the NDSEG Fellowship (2019-2022) and by the Hertz Fellowship (2019-2024).

I received my M.S. in mathematics and B.S. in physics from Yale in 2018. There, I did research under David Poland, studying conformal field theories via the bootstrap program, and under Philsang Yoo on mathematical aspects of quantum field theory and its connections to the Langlands program. Concurrently, I was part of John Murray's lab, where we built tools to study working memory in recurrent neural networks.

I recently worked as a quantitative research intern at Jane Street. I have consulted as a machine learning scientist for two biotech firms, Protein Evolution and Quantum Si. While an undergrad, I interned at Google, applying deep learning to computer vision on Internet of Things devices. I also worked at the Perimeter Institute under Erik Schnetter on tackling the curse of dimensionality in numerical partial differential equations.

My earliest exposure to scientific research was under Dr. James Ellenbogen at MITRE, and with Drs. John Dell and Jonathan Osborne at Thomas Jefferson High School.

**(2024) In Submission**
[arXiv]

**(2024) ICML 2024**
[arXiv]

**(2023) NeurIPS 2023**
[OpenReview]
[arXiv]

**(2022) ICLR 2023**
[OpenReview]
[arXiv]

**(2021) ICLR 2022**
[OpenReview]
[arXiv]

**(2021) eNeuro**
[Journal Link]
[bioRxiv] [Code Repository]

**(2022) Journal of High Energy Physics**
[Journal Link]
[arXiv]

**(2021) Physical Review D**
[Journal Link]
[arXiv] [IAS Talk]

**(2021) Journal of High Energy Physics**
[Journal Link]
[arXiv]
[Princeton Talk]

**(2018) Journal of High Energy Physics**
[Journal Link]
[arXiv]
[Code Repository]

**(2018) Yale Senior Thesis** [PDF] [Presentation Slides]

**(2017)** [arXiv] [PDF]
[Code Repository]

**(2017) Physical Review A** [Journal Link] [PDF]

**(Fall 2015)** [PDF]

**(2021-2023)** [Bouchaud & Potters: Random Matrix Theory] [MacKay: Information Theory, Inference, and Learning Algorithms Part 0, Part 1, Part 2] [HST: Elements of Statistical Learning] [Engel and Van Den Broeck: Statistical Mechanics of Learning] [Kardar: Statistical Physics of Fields] [Mezard and Montanari: Information, Physics, and Computation Chapter 1, Chapter 4, Chapter 5, Chapter 8]

**(2019-2020)** [PDF]

**(Spring 2020)** [PDF]

**(Fall 2018)** [PDF]

**(Fall 2018)** [PDF]
[Chapter 1: Black Holes and the Holographic Principle]
[Chapter 2: Matrices and Strings]
[Chapter 3: Holographic Duality]

**(Spring 2018)** [Lecture 1] [Lecture 2]

**(Fall 2017)** [PDF]

**(Spring, Fall 2017)** [Full Notes] [Part 1: Categorical Harmonic Analysis]
[Part 2: Moduli Space of Bundles]
[Part 3: Geometric Satake]
[Part 4: Geometric Representation Theory]
[Part 5: Intro to Derived Algebraic Geometry]
[Part 6: Back to Basics]
[Part 7: Singular Support]
[Part 8: Revisiting D(Bun_G)]
[Part 9: How to study D(Bun_G)]
[Part 10: Factorization Structures]
[Part 11: Fundamental Local Equivalence]

**(Spring, Fall 2017)** [Spring Talk] [Fall Talk]

**(Fall 2016)** [PDF]

**(Fall 2016)** [Review notes on Fiber Bundles]
[Talk 1] [Talk 2]

**(Spring 2016)** [PDF]

**(Spring 2016)** [PDF]

**(Summer 2020)** [Final Paper]

**(Summer 2019)** [Final Presentation]

**(Summer 2016)** [Lecture]

**(Fall 2013)** [PDF]

**(Fall 2013)** [PDF]