Details
| Field | Value |
| --- | --- |
| Original language | English |
| Title of host publication | ICML 2021 Workshop AutoML |
| Number of pages | 22 |
| Publication status | E-pub ahead of print - 2021 |
Abstract

Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and tuning the hyperparameters of the training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa; there exists little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at https://github.com/automl/multi-obj-baselines.
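The linked repository contains the authors' actual baseline implementations. As a rough, self-contained illustration of the problem setting only (not the paper's method and not the repository's API), the sketch below samples configurations from one joint architecture + hyperparameter space, scores each on two objectives using a toy surrogate in place of real training, and keeps the Pareto front of non-dominated candidates. All function names and the surrogate objective are hypothetical.

```python
import random

# Minimal sketch of multi-objective joint NAS + HPO: architecture choices and
# training hyperparameters live in ONE search space, and candidates are
# compared on several objectives at once via Pareto dominance.
# Everything here is illustrative; it is not the paper's code.

def sample_config(rng):
    """Draw one joint configuration: architecture + hyperparameters."""
    return {
        # architecture part of the search space
        "n_layers": rng.randint(1, 8),
        "width": rng.choice([16, 32, 64, 128]),
        # hyperparameter part of the search space
        "learning_rate": 10 ** rng.uniform(-4, -1),
        "batch_size": rng.choice([32, 64, 128]),
    }

def evaluate(cfg):
    """Stand-in for training: returns (validation error, model size).
    A real pipeline would train a network; this toy surrogate just creates
    a trade-off between the two objectives (both minimized)."""
    size = cfg["n_layers"] * cfg["width"] ** 2  # proxy for parameter count
    error = 1.0 / (1.0 + 0.001 * size) + abs(cfg["learning_rate"] - 0.01)
    return (error, size)

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(results):
    """Keep the configurations not dominated by any other candidate."""
    return [
        (cfg, obj) for cfg, obj in results
        if not any(dominates(other, obj) for _, other in results if other != obj)
    ]

if __name__ == "__main__":
    rng = random.Random(0)
    # Random search over the joint space: the simplest multi-objective baseline.
    results = [(cfg, evaluate(cfg)) for cfg in (sample_config(rng) for _ in range(50))]
    for cfg, (err, size) in sorted(pareto_front(results), key=lambda r: r[1]):
        print(f"error={err:.3f}  size={size:>7}  config={cfg}")
```

The point of the sketch is only the shape of the problem the paper addresses: a single search space covering both architecture and hyperparameters, several objectives, and a Pareto set rather than a single best configuration as the result.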
Keywords
- cs.LG
- cs.AI
- stat.ML
Cite this
Guerrero-Viu, J., Hauns, S., Izquierdo, S., Miotto, G., Schrodi, S., Biedenkapp, A., Elsken, T., Deng, D., Lindauer, M., & Hutter, F. (2021). Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization. In ICML 2021 Workshop AutoML.
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review
TY - GEN
T1 - Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization
AU - Guerrero-Viu, Julia
AU - Hauns, Sven
AU - Izquierdo, Sergio
AU - Miotto, Guilherme
AU - Schrodi, Simon
AU - Biedenkapp, André
AU - Elsken, Thomas
AU - Deng, Difan
AU - Lindauer, Marius
AU - Hutter, Frank
PY - 2021
Y1 - 2021
N2 - Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and tuning the hyperparameters of the training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa; there exists little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at https://github.com/automl/multi-obj-baselines.
AB - Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and tuning the hyperparameters of the training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa; there exists little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at https://github.com/automl/multi-obj-baselines.
KW - cs.LG
KW - cs.AI
KW - stat.ML
M3 - Conference contribution
BT - ICML 2021 Workshop AutoML
ER -