Simon Batzner
Contact
E-Mail: echo "moc.liamg@renztabnomis" | rev
Research Interests
My interests lie at the intersection of Deep Learning and Physics, in particular the role of structure and symmetry in Deep Learning.
News
- 05/2023: New Scientist covered our work on large-scale biomolecular dynamics.
- 04/2023: The final paper from my PhD is now public. We show how Allegro's state-of-the-art accuracy can be scaled to large biomolecular systems, up to the all-atom, fully solvated HIV capsid at 44 million atoms. We include strong- and weak-scaling results for a large pretrained Allegro potential and show top speeds of >100 steps/second on large biomolecular systems.
- 04/2023: I've joined Google Brain (now Google DeepMind) in San Francisco. I look forward to continuing to work with Ekin Dogus Cubuk and team.
- 03/2023: I've defended my Ph.D.
- 03/2023: New preprint: we introduce TM23, a new ML potential benchmark of transition metals and test the performance of NequIP and FLARE.
- 02/2023: Allegro was selected as an Editor's highlight in Nature Communications.
- 02/2023: Our Allegro paper is now published in Nature Communications.
- 01/2023: I will be giving an invited talk in the AI4Science talk series hosted by the University of Stuttgart and NEC Labs Europe.
- 01/2023: I will be giving an invited talk at the Department of Physics, Astronomy, and Materials Science at Missouri State University.
- 01/2023: We’re organizing an ICLR 2023 workshop on ML4Materials.
- 11/2022: Preprint alert: We have posted a preprint on a method to obtain fast uncertainty estimates in Deep Learning Interatomic Potentials, led by our undergraduate researcher Albert Zhu.
- 11/2022: We will be at MRS with 5 accepted NequIP+Allegro-related talks! Come hang out.
- 10/2022: I will continue at Google Brain as a Student Researcher over the fall.
- 06/2022: I started an internship at Google Brain in SF, working with Ekin Dogus Cubuk.
- 05/2022: I won the Best Student Presentation Award in the ML Symposium at MRS '22. Thank you to the organizers for an excellent symposium.
- 05/2022: We're organizing the Swiss Equivariant Learning Workshop from July 11th to 14th, 2022, in Lausanne; sign up here by June 24th.
- 05/2022: Preprint alert: We have posted a preprint on a unifying theory of E(3)-equivariant interatomic potentials. Joint work with the group of Gábor Csányi as well as Christoph Ortner and Ralf Drautz.
- 05/2022: Our NequIP paper is now published in Nature Communications.
- 05/2022: The Allegro code is now public on our group's GitHub.
- 04/2022: My undergraduate university hosted me on their Made in Science podcast.
- 04/2022: I will be giving an invited talk at EPFL in the Chemistry department.
- 04/2022: I will be giving an invited talk in the Logag reading group covering the Allegro preprint.
- 04/2022: Preprint alert: We have posted a preprint describing our new Machine Learning Interatomic Potential called Allegro.
- 03/2022: Our work on multi-task learning of collective variables was published in JCTC, led by Lixin Sun.
- 03/2022: There are seven NequIP-related talks at APS March Meeting 2022 in Chicago, from both our group and others.
- 03/2022: I will be giving an invited symposium talk at APS March Meeting 2022 in Chicago on Equivariant Interatomic Potentials.
- 01/2022: I will be joining the team at Google Brain for the summer of 2022. I look forward to working with Ekin Dogus Cubuk.
- 12/2021: We have posted an updated version of our NequIP preprint.
- 12/2021: We have a tutorial and 4 accepted NequIP-related talks at MRS Fall 2021 in Boston!
- 11/2021: We have released v0.5 of our NequIP code.
About
I am a Research Scientist at Google DeepMind in San Francisco. I recently defended my PhD from Harvard University, where I spent four years in the group of Boris Kozinsky as well as six months at Google Brain working with Ekin Dogus Cubuk. Prior to Harvard, I obtained a Master's from MIT, where I wrote a thesis on equivariant neural networks. During my undergrad, I spent a year in Los Angeles working on the NASA mission SOFIA, where I wrote software for analyzing telescope data and used ML to model the dynamics of piezoelectrics. I obtained my Bachelor's from the University of Stuttgart, Germany. I am originally from a small but beautiful town a few minutes from the Bavarian Alps.
Education
- 2019-2023, Harvard University, Ph.D.
- 2017-2019, Massachusetts Institute of Technology, S.M.
- 2016-2017, DSI at the NASA Armstrong Flight Research Center, Thesis Candidate
- 2013-2017, University of Stuttgart, Germany, B.Sc.
Publications
Here is a link to my Google Scholar.
Resume
Here is a link to my LinkedIn.
Invited talks
- 2023, University of Stuttgart and NEC Labs Europe
- 2023, Missouri State University
- 2022, Meta AI
- 2022, EPFL, Institute of Chemical Sciences and Engineering
- 2022, Learning on Graphs and Geometry Reading Group (recording | slides)
- 2022, APS March Meeting, Invited Symposium Speaker
- 2022, Aqemia
- 2022, Odyssey Therapeutics
- 2021, Carnegie Mellon University, SciML Speaker Series (slides)
- 2021, University of Minnesota
- 2021, University of California at Berkeley, Teresa and Martin Head-Gordon groups
- 2021, Comenius University, Bratislava, Slovakia, Martonak group
- 2021, University of Cambridge, Machine Learning Discussion Group
- 2021, MILA, Quebec AI Institute, LambdaZero team
Software
During my PhD I published two codes, for the NequIP and Allegro potentials. Both are public on my PhD group's GitHub. If you have questions, please reach out to the current developers on GitHub; they are happy to help.
Both codes also come with a LAMMPS pair_style, written by my brilliant former labmate Anders Johansson, which can be found here.
Referee Activity
Reviewer for Nature Computational Science, NeurIPS, ICML, Nature Communications, Communications Chemistry, Journal of Chemical Theory and Computation, ACS Nano
Teaching
- Harvard AP 275, Spring 2021: Computational Design of Materials (TF)
- MIT 2.086, Spring 2019: Numerical Computation for Mechanical Engineers (TA)
- University of Stuttgart, 2014/2015: Higher Mathematics 1 + 2 (Tutor)