Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization

Publication: Contribution to book/report/anthology/conference proceedings › Paper in conference proceedings › Research › Peer-reviewed

Authors

  • Julia Guerrero-Viu
  • Sven Hauns
  • Sergio Izquierdo
  • Guilherme Miotto
  • Simon Schrodi
  • André Biedenkapp
  • Thomas Elsken
  • Difan Deng
  • Marius Lindauer
  • Frank Hutter

External organizations

  • Albert-Ludwigs-Universität Freiburg
  • Bosch Center for Artificial Intelligence (BCAI)

Details

Original language: English
Title of host publication: ICML 2021 Workshop AutoML
Number of pages: 22
Publication status: Published electronically (e-pub) - 2021

Abstract

Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and tuning the hyperparameters of the training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa; there is little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem, in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at https://github.com/automl/multi-obj-baselines.
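
To make the problem setting concrete (this is only an illustrative sketch, not one of the paper's methods), multi-objective joint NAS + HPO can be phrased as searching a single configuration space that mixes architectural choices with training hyperparameters, then keeping the Pareto front over several objectives. All names below (sample_config, evaluate, pareto_front) and the random-search strategy are hypothetical placeholders:

import random

def sample_config():
    # One joint configuration: architecture choices (NAS) and training
    # hyperparameters (HPO) are sampled together rather than in two stages.
    return {
        "num_layers": random.randint(2, 8),                 # NAS decision
        "width": random.choice([64, 128, 256]),             # NAS decision
        "learning_rate": 10 ** random.uniform(-4, -1),      # HPO decision
        "batch_size": random.choice([32, 64, 128]),         # HPO decision
    }

def evaluate(config):
    # Stand-in for "train the network, then measure it". Returns two
    # objectives to minimize: validation error and a resource proxy.
    val_error = random.random()                             # placeholder value
    resource_cost = config["num_layers"] * config["width"]  # e.g., model size
    return (val_error, resource_cost)

def pareto_front(evaluated):
    # Keep every configuration that no other configuration matches or beats
    # in all objectives while differing in at least one.
    front = []
    for cfg, obj in evaluated:
        dominated = any(
            all(o <= m for o, m in zip(other, obj)) and other != obj
            for _, other in evaluated
        )
        if not dominated:
            front.append((cfg, obj))
    return front

history = [(cfg, evaluate(cfg)) for cfg in (sample_config() for _ in range(50))]
for cfg, (err, cost) in pareto_front(history):
    print(f"error={err:.3f}  cost={cost}  config={cfg}")

The point of the sketch is the shape of the problem: one search space covering both kinds of decisions, and a set of non-dominated trade-offs as the result rather than a single best configuration.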

Cite

Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization. / Guerrero-Viu, Julia; Hauns, Sven; Izquierdo, Sergio et al.
ICML 2021 Workshop AutoML. 2021.


Guerrero-Viu, J, Hauns, S, Izquierdo, S, Miotto, G, Schrodi, S, Biedenkapp, A, Elsken, T, Deng, D, Lindauer, M & Hutter, F 2021, Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization. in ICML 2021 Workshop AutoML. <https://arxiv.org/abs/2105.01015>
Guerrero-Viu, J., Hauns, S., Izquierdo, S., Miotto, G., Schrodi, S., Biedenkapp, A., Elsken, T., Deng, D., Lindauer, M., & Hutter, F. (2021). Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization. In ICML 2021 Workshop AutoML. Advance online publication. https://arxiv.org/abs/2105.01015
Guerrero-Viu J, Hauns S, Izquierdo S, Miotto G, Schrodi S, Biedenkapp A et al. Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization. In: ICML 2021 Workshop AutoML. 2021. Epub 2021.
Guerrero-Viu, Julia ; Hauns, Sven ; Izquierdo, Sergio et al. / Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization. ICML 2021 Workshop AutoML. 2021.
BibTeX
@inproceedings{9fa191fe19124ce19db5ca8a6cacd71e,
title = "Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization",
abstract = "Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and tuning the hyperparameters of the used training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa - there exists little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem, in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at https://github.com/automl/multi-obj-baselines.",
keywords = "cs.LG, cs.AI, stat.ML",
author = "Julia Guerrero-Viu and Sven Hauns and Sergio Izquierdo and Guilherme Miotto and Simon Schrodi and Andr{\'e} Biedenkapp and Thomas Elsken and Difan Deng and Marius Lindauer and Frank Hutter",
year = "2021",
language = "English",
booktitle = "ICML 2021 Workshop AutoML",
}

RIS

TY  - GEN
T1  - Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization
AU  - Guerrero-Viu, Julia
AU  - Hauns, Sven
AU  - Izquierdo, Sergio
AU  - Miotto, Guilherme
AU  - Schrodi, Simon
AU  - Biedenkapp, André
AU  - Elsken, Thomas
AU  - Deng, Difan
AU  - Lindauer, Marius
AU  - Hutter, Frank
PY  - 2021
Y1  - 2021
N2  - Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and tuning the hyperparameters of the used training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa - there exists little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem, in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at https://github.com/automl/multi-obj-baselines.
AB  - Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and tuning the hyperparameters of the used training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa - there exists little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem, in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at https://github.com/automl/multi-obj-baselines.
KW  - cs.LG
KW  - cs.AI
KW  - stat.ML
M3  - Conference contribution
BT  - ICML 2021 Workshop AutoML
ER  -
