Simon Batzner
Contact
E-Mail: echo "ude.dravrah.g@renztab" | rev
Simon Batzner
Harvard University
29 Oxford St Pierce Hall, Room 302
Cambridge, MA 02138
Research Interests
My interests lie at the intersection of Deep Learning and Physics. More specifically, I am interested in improving and accelerating Molecular Simulation with the help of Deep Learning. A core focus of my work has been on the role of symmetry. In particular, my Ph.D. work introduced E(3)-equivariant Machine Learning Interatomic Potentials and was the first to demonstrate the large improvements they can yield in sample efficiency, accuracy, and out-of-distribution generalization. More recently, my interests have focused on scalability, generalization to out-of-distribution data, and the theory of equivariant neural networks. Most notably, our team was able to scale a state-of-the-art equivariant ML potential to >100,000,000 atoms on as few as 128 GPUs while maintaining high throughput. Our approach combines designing novel algorithms for potentials with implementing them in fast, massively scalable, and easy-to-use software that is widely used by researchers around the globe.
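As a minimal illustration of what E(3)-equivariance means (this is a toy sketch, not code from NequIP or Allegro): a function mapping atomic positions to per-atom vectors is equivariant if rotating and translating the inputs transforms the outputs accordingly, i.e. f(Rx + t) = R f(x). The toy function below is equivariant by construction and the check verifies it numerically.

```python
import numpy as np

def toy_forces(positions):
    # Toy force-like map: vectors from each atom to the centroid.
    # Translation-invariant and rotation-equivariant by construction.
    return positions.mean(axis=0) - positions

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))           # 5 atoms in 3D

# Random orthogonal matrix (rotation/reflection) via QR decomposition.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
t = rng.normal(size=3)                # random translation

lhs = toy_forces(x @ Q.T + t)         # transform inputs, then evaluate
rhs = toy_forces(x) @ Q.T             # evaluate, then rotate outputs
print(np.allclose(lhs, rhs))          # True: the map is E(3)-equivariant
```

Equivariant networks enforce this property architecturally for all intermediate features, rather than learning it approximately from data augmentation.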
News
- 02/2023: Our Allegro paper is now published in Nature Communications
- 01/2023: I will be giving an invited talk in the AI4Science talk series hosted by the University of Stuttgart and NEC Labs Europe.
- 01/2023: I will be giving an invited talk at the Department of Physics, Astronomy, and Materials Science at Missouri State University.
- 01/2023: We’re organizing an ICLR 2023 workshop on ML4Materials.
- 11/2022: Preprint alert: We have posted a preprint on a method to obtain fast uncertainty estimates in Deep Learning Interatomic Potentials, led by our undergraduate researcher Albert Zhu.
- 11/2022: We will be at MRS with 5 accepted NequIP+Allegro-related talks! Come hang out.
- 10/2022: I will continue at Google Brain as a Student Researcher over the fall.
- 06/2022: I started an internship at Google Brain in SF, working with Ekin Dogus Cubuk.
- 05/2022: I won the Best Student Presentation Award in the ML Symposium at MRS '22. Thank you to the organizers for an excellent symposium.
- 05/2022: We're organizing the Swiss Equivariant Learning Workshop from July 11th-14th, 2022, in Lausanne; sign up here by June 24th.
- 05/2022: Preprint alert: We have posted a preprint on a unifying theory of E(3)-equivariant interatomic potentials. Joint work with the group of Gábor Csányi as well as Christoph Ortner and Ralf Drautz.
- 05/2022: Our NequIP paper is now published in Nature Communications
- 05/2022: The Allegro code is now public on our group's GitHub
- 04/2022: My undergraduate university hosted me on their Made in Science podcast
- 04/2022: I will be giving an invited talk at EPFL in the Chemistry department
- 04/2022: I will be giving an invited talk in the Logag reading group covering the Allegro preprint.
- 04/2022: Preprint alert: We have posted a preprint describing our new Machine Learning Interatomic Potential called Allegro
- 03/2022: Our work on multi-task learning of collective variables was published in JCTC, led by Lixin Sun.
- 03/2022: There are seven NequIP-related talks at APS March Meeting 2022 in Chicago, both from our group and others.
- 03/2022: I will be giving an invited symposium talk at APS March Meeting 2022 in Chicago on Equivariant Interatomic Potentials
- 01/2022: I will be joining the team at Google Brain for the summer of 2022. I look forward to working with Ekin Dogus Cubuk.
- 12/2021: We have posted an updated version of our NequIP preprint
- 12/2021: We have a tutorial and 4 accepted NequIP-related talks at MRS Fall 2021 in Boston!
- 11/2021: We have released v0.5 of our NequIP code
About
I am a fourth-year PhD student at Harvard University, fortunate to be advised by Boris Kozinsky.
Prior to joining Harvard, I obtained a Master's from MIT, where I worked with Alexie Kolpak and Boris Kozinsky. At MIT, I also wrote a thesis on equivariant neural networks. Before that, I spent a year in Los Angeles, working on the NASA mission SOFIA, where I wrote software for analyzing telescope data and used ML to model the dynamics of piezoelectrics. I obtained my Bachelor's from the University of Stuttgart, Germany. I am originally from a small but beautiful town a few minutes from the Bavarian Alps.
Education
- since 2019, Harvard University, Ph.D. candidate
- 2017-2019, Massachusetts Institute of Technology, S.M.
- 2016-2017, DSI at the NASA Armstrong Flight Research Center, Thesis Candidate
- 2013-2017, University of Stuttgart, Germany, B.Sc.
Publications
Here is a link to my Google Scholar.
Resume
Here is a link to my Linkedin.
Invited talks
- 2023, University of Stuttgart and NEC Labs Europe
- 2023, Missouri State University
- 2022, Meta AI
- 2022, EPFL, Institute of Chemical Sciences and Engineering
- 2022, Learning on Graphs and Geometry Reading Group (recording | slides)
- 2022, APS March Meeting, Invited Symposium Speaker
- 2022, Aqemia
- 2022, Odyssey Therapeutics
- 2021, Carnegie Mellon University, SciML Speaker Series (slides)
- 2021, University of Minnesota
- 2021, University of California at Berkeley, Teresa and Martin Head-Gordon groups
- 2021, Comenius University, Bratislava, Slovakia, Martonak group
- 2021, University of Cambridge, Machine Learning Discussion Group
- 2021, MILA, Quebec AI Institute, LambdaZero team
Software
We have published two codes, for the NequIP and Allegro potentials; both are public on our group's GitHub, and each comes with a LAMMPS pair_style. If you have questions, please reach out; we are happy to help.
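As a rough sketch of how such a pair_style is typically used in a LAMMPS input script (the model filename and element list below are placeholders, not from our documentation):

```
# Hypothetical LAMMPS input fragment using the NequIP pair_style.
units         metal
atom_style    atomic

pair_style    nequip
# Map a deployed model to all atom types; elements must match the type order.
pair_coeff    * * deployed_model.pth H C O
```

The deployed model is a TorchScript file produced from a trained potential, so the same trained model can drive production molecular dynamics without Python in the loop.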
Teaching
- Harvard AP 275, Spring 2021: Computational Design of Materials (TF)
- MIT 2.086, Spring 2019: Numerical Computation for Mechanical Engineers (TA)
- University of Stuttgart, 2014/2015: Higher Mathematics 1 + 2 (Tutor)