Details
Original language | English
---|---
Title of host publication | ICML 2021 Workshop AutoML
Number of pages | 22
Publication status | Published electronically (E-pub) - 2021
Abstract

Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and by tuning the hyperparameters of the training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa; there is little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at https://github.com/automl/multi-obj-baselines.
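The abstract's core idea, sampling architectures and hyperparameters jointly and comparing candidates on several objectives at once, can be illustrated with a minimal random-search sketch. Everything below (the search space, the toy `evaluate` function, the objective values) is an illustrative assumption and is not the API of the linked multi-obj-baselines repository:

```python
import random

# Hypothetical joint search space: architectural choices and training
# hyperparameters are drawn together rather than in two separate stages.
SEARCH_SPACE = {
    "num_layers":    [2, 4, 8],           # architecture
    "num_channels":  [16, 32, 64],        # architecture
    "learning_rate": [1e-1, 1e-2, 1e-3],  # training hyperparameter
    "batch_size":    [32, 64, 128],       # training hyperparameter
}

def sample_config():
    """Draw one joint architecture + hyperparameter configuration."""
    return {name: random.choice(choices) for name, choices in SEARCH_SPACE.items()}

def evaluate(config):
    """Toy stand-in for training: returns (validation error, parameter count).
    A real evaluation would train the sampled network and measure both
    objectives; here they are faked so the sketch runs instantly."""
    size = config["num_layers"] * config["num_channels"] ** 2
    error = 1.0 / (1.0 + size / 1e4) + abs(config["learning_rate"] - 1e-2)
    return (error, size)

def pareto_front(results):
    """Keep configurations whose objective vectors are not dominated
    by any other candidate (both objectives are minimized)."""
    front = []
    for cfg_i, obj_i in results:
        dominated = any(
            obj_j != obj_i and all(a <= b for a, b in zip(obj_j, obj_i))
            for _, obj_j in results
        )
        if not dominated:
            front.append((cfg_i, obj_i))
    return front

if __name__ == "__main__":
    random.seed(0)
    results = [(cfg, evaluate(cfg)) for cfg in (sample_config() for _ in range(50))]
    for cfg, (err, size) in pareto_front(results):
        print(f"error={err:.3f}  params={size:6d}  {cfg}")
```

A model-guided optimizer (Bayesian or evolutionary) would replace `sample_config` with an informed proposal mechanism, but the joint search space and the Pareto comparison of objective vectors stay the same.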
Cite
Guerrero-Viu, J., Hauns, S., Izquierdo, S., Miotto, G., Schrodi, S., Biedenkapp, A., Elsken, T., Deng, D., Lindauer, M., & Hutter, F. (2021). Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization. In ICML 2021 Workshop AutoML.
Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed
TY - GEN
T1 - Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization
AU - Guerrero-Viu, Julia
AU - Hauns, Sven
AU - Izquierdo, Sergio
AU - Miotto, Guilherme
AU - Schrodi, Simon
AU - Biedenkapp, André
AU - Elsken, Thomas
AU - Deng, Difan
AU - Lindauer, Marius
AU - Hutter, Frank
PY - 2021
Y1 - 2021
N2 - Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and tuning the hyperparameters of the used training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa - there exists little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem, in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at https://github.com/automl/multi-obj-baselines.
AB - Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and tuning the hyperparameters of the used training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa - there exists little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem, in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at https://github.com/automl/multi-obj-baselines.
KW - cs.LG
KW - cs.AI
KW - stat.ML
M3 - Conference contribution
BT - ICML 2021 Workshop AutoML
ER -