Details
| Original language | English |
|---|---|
| Publication status | E-pub ahead of print - 16 Aug 2019 |
| Externally published | Yes |
Abstract

Hyperparameter optimization and neural architecture search can become prohibitively expensive for regular black-box Bayesian optimization because the training and evaluation of a single model can easily take several hours. To overcome this, we introduce a comprehensive tool suite for effective multi-fidelity Bayesian optimization and the analysis of its runs. The suite, written in Python, provides a simple way to specify complex design spaces, a robust and efficient combination of Bayesian optimization and HyperBand, and a comprehensive analysis of the optimization process and its outcomes.
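As context for the multi-fidelity approach named in the title, the following is a minimal sketch of successive halving, the budget-allocation scheme at the core of HyperBand: all candidates are evaluated on a small budget, the worst fraction is discarded, and the survivors are re-evaluated on a larger budget. The function names and the toy noisy loss below are illustrative assumptions, not the suite's actual API.

```python
import random

def successive_halving(configs, evaluate, min_budget=1, max_budget=27, eta=3):
    """Evaluate all configs at a small budget, keep the best 1/eta
    fraction, and repeat with eta times the budget until one remains."""
    budget = min_budget
    while budget <= max_budget and len(configs) > 1:
        scores = sorted((evaluate(c, budget), c) for c in configs)  # lower loss is better
        keep = max(1, len(configs) // eta)
        configs = [c for _, c in scores[:keep]]
        budget *= eta
    return configs[0]

# Toy loss: distance of hyperparameter x from 0.5, noisier at low budgets,
# mimicking cheap low-fidelity evaluations (e.g. fewer training epochs).
random.seed(0)
def loss(x, budget):
    return abs(x - 0.5) + random.gauss(0, 1.0 / budget)

candidates = [i / 10 for i in range(10)]
best = successive_halving(candidates, loss)
```

HyperBand runs several such brackets with different trade-offs between the number of starting configurations and the initial budget; BOHB (the combination of Bayesian optimization and HyperBand mentioned in the abstract) additionally replaces random candidate sampling with a model-based proposal.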
Keywords
- cs.LG
- cs.AI
- stat.ML
Cite this
Research output: Working paper/Preprint › Preprint
TY - UNPB
T1 - BOAH: A Tool Suite for Multi-Fidelity Bayesian Optimization & Analysis of Hyperparameters
AU - Lindauer, Marius
AU - Eggensperger, Katharina
AU - Feurer, Matthias
AU - Biedenkapp, André
AU - Marben, Joshua
AU - Müller, Philipp
AU - Hutter, Frank
PY - 2019/8/16
Y1 - 2019/8/16
N2 - Hyperparameter optimization and neural architecture search can become prohibitively expensive for regular black-box Bayesian optimization because the training and evaluation of a single model can easily take several hours. To overcome this, we introduce a comprehensive tool suite for effective multi-fidelity Bayesian optimization and the analysis of its runs. The suite, written in Python, provides a simple way to specify complex design spaces, a robust and efficient combination of Bayesian optimization and HyperBand, and a comprehensive analysis of the optimization process and its outcomes.
AB - Hyperparameter optimization and neural architecture search can become prohibitively expensive for regular black-box Bayesian optimization because the training and evaluation of a single model can easily take several hours. To overcome this, we introduce a comprehensive tool suite for effective multi-fidelity Bayesian optimization and the analysis of its runs. The suite, written in Python, provides a simple way to specify complex design spaces, a robust and efficient combination of Bayesian optimization and HyperBand, and a comprehensive analysis of the optimization process and its outcomes.
KW - cs.LG
KW - cs.AI
KW - stat.ML
M3 - Preprint
BT - BOAH: A Tool Suite for Multi-Fidelity Bayesian Optimization & Analysis of Hyperparameters
ER -