The AutoML community offers a wide variety of open-source packages to meet diverse machine learning needs, whether you’re optimizing neural architectures, tuning hyperparameters, or configuring algorithms. From libraries for Neural Architecture Search (NAS) to powerful hyperparameter optimization (HPO) packages, there’s something for every machine learning workflow. Each package comes with specific features that can be identified using relevant keywords. If you’re looking for a full AutoML package, for example, AutoGluon could be a fit, whereas for Bayesian optimization, SMAC, HEBO, or Optuna may be more appropriate. Explore these packages based on your project’s needs, and tailor your selection to achieve optimized performance!
Featured AutoML Systems
AutoGluon is an open-source AutoML framework from Amazon that simplifies machine learning for text, tabular, and image data. It provides a user-friendly interface for training highly accurate models with minimal code and supports advanced features like multi-modal learning and ensemble methods.
AutoKeras is an AutoML system based on Keras, a multi-backend deep learning framework with support for JAX, TensorFlow, and PyTorch. (Maintained by the DATA Lab at Texas A&M University)
TabPFN is an AutoML package designed for (small) tabular datasets, delivering strong classification performance without dataset-specific training. Leveraging a pre-trained Transformer model and causal-inspired priors, it achieves competitive results in seconds.
FLAML is a lightweight AutoML library from Microsoft, designed for efficient and cost-effective model selection and hyperparameter tuning. It provides fast, accurate results with minimal computational resources, making it ideal for developers and researchers.
Featured Optimizers
Optuna is a versatile and efficient hyperparameter optimization framework by Preferred Networks that automates the search for optimal parameters using state-of-the-art techniques. It features an easy-to-use API, support for distributed optimization, and advanced features like pruning to accelerate machine learning workflows.
HEBO (Heteroscedastic Evolutionary Bayesian Optimization) is a cutting-edge framework for hyperparameter optimization that excels in both performance and scalability. It combines Bayesian optimization with evolutionary strategies, making it suitable for complex and high-dimensional search spaces.
BoTorch is a PyTorch-based library for Bayesian optimization, designed to be modular and scalable. It enables flexible and efficient optimization over complex objectives, making it ideal for research and applications in machine learning and beyond.
SMAC3 is a robust tool for hyperparameter optimization and algorithm configuration, combining Bayesian optimization with an aggressive racing mechanism. It is highly flexible, supporting both black-box and multi-objective optimization, and is widely used in machine learning and beyond.
Featured Benchmarks
AMLB (AutoML Benchmark) is a comprehensive framework for benchmarking AutoML systems across diverse datasets and tasks. It provides a standardized, reproducible environment to compare the performance of different AutoML solutions.
CARP-S is a benchmarking framework for Comprehensive Automated Research Performance Studies, designed to evaluate multiple optimizers across various benchmark tasks. It aims to facilitate the development of effective and reliable methods for hyperparameter optimization (HPO).
NASBench
There are several NASBench instances, which are benchmarks for neural architecture search. They are typically pre-computed or surrogate-based, which keeps evaluations affordable and sustainable while ensuring better comparability of NAS approaches.
ARLBench is a benchmark for hyperparameter optimization in reinforcement learning, offering efficient evaluation of diverse optimization approaches across various algorithms and environments.
If you are aware of further AutoML packages, please let us know and contribute to automl.space.