Surveys
Several survey articles provide a good starting point for an overview of AutoML breakthroughs, approaches, and applications. Among others, we recommend the following surveys:
- “AutoML: A Survey of the State-of-the-Art” by Xin He, Kaiyong Zhao and Xiaowen Chu (published 2019; last updated 2021)
- “Automated machine learning: past, present and future” by Mitra Baratchi, Can Wang, Steffen Limmer, Jan N. van Rijn, Holger Hoos, Thomas Bäck and Markus Olhofer (published in 2024)
- “Automating data science” by Tijl De Bie, Luc De Raedt, José Hernández-Orallo, Holger H. Hoos, Padhraic Smyth and Christopher K. I. Williams (published in 2022)
- “Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges” by Bernd Bischl, Martin Binder, Michel Lang, Tobias Pielok, Jakob Richter, Stefan Coors, Janek Thomas, Theresa Ullmann, Marc Becker, Anne-Laure Boulesteix, Difan Deng and Marius Lindauer (published in 2023)
- “Neural Architecture Search: A Survey” by Thomas Elsken, Jan Hendrik Metzen and Frank Hutter (published in 2019)
- “Neural Architecture Search: Insights from 1000 Papers” by Colin White, Mahmoud Safari, Rhea Sukthanker, Binxin Ru, Thomas Elsken, Arber Zela, Debadeepta Dey and Frank Hutter (published in 2023)
Literature Lists
If you are interested in less curated but more extensive literature lists, we provide the following pointers:
AutoML
- awesome-automl-papers by Mark Lin
- Awesome-AutoML by Wei Wei
- awesome-AutoML-and-Lightweight-Models by GYChen (Last Update: 10.2020)
Hyperparameter Optimization
Hyperparameter optimization (HPO) automatically tunes a machine learning model’s hyperparameters to achieve optimal performance.
- awesome-hpo by Thinklab@SJT (Last Update: 10.2022)
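To make the idea concrete, here is a minimal sketch of the simplest HPO strategy, random search. The objective function and its hyperparameters are hypothetical stand-ins for training and validating a real model; practical HPO tools use far more sample-efficient methods such as Bayesian optimization.

```python
import random

def train_and_eval(lr, n_trees):
    # Hypothetical objective: stands in for training a real model
    # and returning its validation score.
    return 1.0 - abs(lr - 0.1) - abs(n_trees - 100) / 1000

def random_search(n_trials=50, seed=0):
    # Sample configurations uniformly from the search space and
    # keep the best one seen so far.
    rng = random.Random(seed)
    best_score, best_cfg = float("-inf"), None
    for _ in range(n_trials):
        cfg = {"lr": rng.uniform(1e-4, 1.0), "n_trees": rng.randint(10, 500)}
        score = train_and_eval(**cfg)
        if score > best_score:
            best_score, best_cfg = score, cfg
    return best_cfg, best_score
```

Despite its simplicity, random search is a surprisingly strong baseline and is often the first method compared against in the HPO literature listed above.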
Meta Learning
Meta learning aims to improve learning across different tasks or datasets instead of specializing in a single one.
- Meta-Learning-Papers by Y-Zou (Last Update: 09.2022)
- Meta-Learning-Papers by Flood Sung (Last Update: 09.2018)
Neural Architecture Search
Neural Architecture Search (NAS) aims to search for the optimal architecture for a target task.
- Literature on Neural Architecture Search by Difan Deng and Marius Lindauer
- awesome-transformer-search by Yash Mehta (Last Update: 07.2023)
- Awesome-AutoDL by Barry (Xuanyi) Dong (Last Update: 09.2022)
- awesome-nas-papers by Jian Yang (Last Update: 07.2021)
- Training-Free-NAS by Ernie Chu (Last Update: 11.2023)
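As a toy illustration of the NAS setting, the sketch below randomly samples architectures from a small per-layer operation space and keeps the best according to a scoring function. The operation names and the proxy score are purely hypothetical; real NAS methods train (or use training-free proxies for) each candidate network.

```python
import random

# Hypothetical per-layer operation choices, as in cell-based NAS search spaces.
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]

def evaluate(arch):
    # Hypothetical proxy score; a real NAS run would train and
    # validate the network described by `arch`.
    return sum(1.0 for op in arch if op.startswith("conv")) / len(arch)

def random_nas(n_layers=4, n_samples=20, seed=0):
    # Sample candidate architectures (one operation per layer) and
    # return the highest-scoring one.
    rng = random.Random(seed)
    return max(
        ([rng.choice(OPS) for _ in range(n_layers)] for _ in range(n_samples)),
        key=evaluate,
    )
```

The lists above cover the full range of search strategies (reinforcement learning, evolution, gradient-based, and training-free methods) that go beyond this random baseline.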
Dynamic Algorithm Configuration
Dynamic Algorithm Configuration (DAC) adjusts an algorithm’s hyperparameters dynamically during the optimization process, rather than fixing them once up front.
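A minimal sketch of the idea, assuming gradient descent as the target algorithm: the step size is adapted on the fly in response to the algorithm’s observed behavior. The hand-crafted sign-flip rule below is a hypothetical stand-in for the learned configuration policies studied in the DAC literature.

```python
def dac_gradient_descent(grad, x0, steps=100):
    # Dynamic policy (hand-crafted here; DAC methods learn such policies):
    # halve the step size whenever the gradient sign flips,
    # which indicates the iterate overshot the minimum.
    x, lr, prev_g = x0, 0.8, 0.0
    for _ in range(steps):
        g = grad(x)
        if g * prev_g < 0:  # sign flip -> reduce step size
            lr *= 0.5
        x -= lr * g
        prev_g = g
    return x
```

For example, minimizing f(x) = (x - 3)^2 via its gradient 2(x - 3) converges to x ≈ 3 even though the initial step size is too large, because the policy shrinks it after the first overshoot.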