OpenEuroLLM - Pre-Training for Large Language Models
As part of OpenEuroLLM, Aaron leads a team focused on large-scale pre-training of large language models (LLMs). His research aims to develop automated methods that make LLM pre-training more resource-efficient.
Before joining the ELLIS Institute Tübingen, he led the AutoML research group at ScaDS.AI (Center for Scalable Data Analytics and Artificial Intelligence) at the University of Leipzig. Previously, he worked as an Applied Scientist at AWS, contributing to the science teams behind SageMaker and Amazon Q.
Aaron earned his PhD in 2019 at the University of Freiburg under the supervision of Frank Hutter. His work has received multiple recognitions, including the Best Paper Award at the 2022 AutoML Conference and a win in the 2015 ChaLearn AutoML Challenge. He also co-hosts the virtual AutoML Seminar.
For more information, visit Aaron’s personal website or Google Scholar profile.
Selected Publications:
- NAS-Bench-101: Towards Reproducible Neural Architecture Search
- BOHB: Robust and Efficient Hyperparameter Optimization at Scale
- Efficient and Robust Automated Machine Learning
- Syne Tune: A Library for Large Scale Hyperparameter Tuning and Reproducible Research