Efficient benchmarking of algorithm configurators via model-based surrogates

Publication: Contribution to journal › Article › Research › Peer-reviewed

Authors

  • Katharina Eggensperger
  • Marius Lindauer
  • Holger H. Hoos
  • Frank Hutter
  • Kevin Leyton-Brown

External organisations

  • Albert-Ludwigs-Universität Freiburg
  • University of British Columbia

Details

Original language: English
Pages (from–to): 15-41
Number of pages: 27
Journal: Machine learning
Volume: 107
Issue number: 1
Early online date: 22 Dec 2017
Publication status: Published - Jan 2018
Published externally: Yes

Abstract

The optimization of algorithm (hyper-)parameters is crucial for achieving peak performance across a wide range of domains, ranging from deep neural networks to solvers for hard combinatorial problems. However, the proper evaluation of new algorithm configuration (AC) procedures (or configurators) is hindered by two key hurdles. First, AC scenarios are hard to set up, including the target algorithm to be optimized and the problem instances to be solved. Second, and even more significantly, they are computationally expensive: a single configurator run involves many costly runs of the target algorithm. Here, we propose a benchmarking approach that uses surrogate scenarios, which are computationally cheap while remaining close to the original AC scenarios. These surrogate scenarios approximate the response surface corresponding to true target algorithm performance using a regression model. In our experiments, we construct and evaluate surrogate scenarios for hyperparameter optimization as well as for AC problems that involve performance optimization of solvers for hard combinatorial problems. We generalize previous work by building surrogates for AC scenarios with multiple problem instances, stochastic target algorithms and censored running time observations. We show that our surrogate scenarios capture overall important characteristics of the original AC scenarios from which they were derived, while being much easier to use and orders of magnitude cheaper to evaluate.
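The abstract's core idea can be illustrated with a minimal, hypothetical sketch. The paper fits empirical performance models (quantile regression forests, per the funding note) to observed target-algorithm runs; for a self-contained toy example, the sketch below swaps in a 1-nearest-neighbour regressor and a synthetic response surface. All names and the toy "algorithm" are illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch of surrogate-based benchmarking (not the authors' code).
# Idea: replace costly target-algorithm runs with predictions from a cheap
# regression model fitted on previously observed (configuration, performance) pairs.

# "True" target algorithm: expensive to evaluate in practice; here a toy
# quadratic response surface with its optimum at configuration 0.3.
def true_runtime(x):
    return (x - 0.3) ** 2 + 0.05

# Offline phase: gather observations from real runs (the expensive, one-time step).
observations = [(x / 10.0, true_runtime(x / 10.0)) for x in range(11)]

# Cheap surrogate: 1-nearest-neighbour regression over the stored observations.
# The paper instead uses quantile regression forests for this role.
def surrogate_runtime(x):
    return min(observations, key=lambda obs: abs(obs[0] - x))[1]

# Online phase: a configurator evaluates many candidate configurations against
# the surrogate instead of the target algorithm; each query is essentially free.
candidates = [x / 100.0 for x in range(101)]
best = min(candidates, key=surrogate_runtime)
print(best)  # a configuration near the true optimum of 0.3
```

The key property the paper evaluates is that rankings and trajectories of configurators on the surrogate should resemble those on the original scenario, while each evaluation drops from minutes or hours to a model prediction.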

ASJC Scopus subject areas

Cite

Efficient benchmarking of algorithm configurators via model-based surrogates. / Eggensperger, Katharina; Lindauer, Marius; Hoos, Holger H. et al.
In: Machine learning, Vol. 107, No. 1, 01.2018, pp. 15-41.


Eggensperger K, Lindauer M, Hoos HH, Hutter F, Leyton-Brown K. Efficient benchmarking of algorithm configurators via model-based surrogates. Machine learning. 2018 Jan;107(1):15-41. Epub 2017 Dec 22. doi: 10.1007/s10994-017-5683-z
Eggensperger, Katharina ; Lindauer, Marius ; Hoos, Holger H. et al. / Efficient benchmarking of algorithm configurators via model-based surrogates. In: Machine learning. 2018 ; Vol. 107, No. 1. pp. 15-41.
BibTeX
@article{db92fb7ca9b64d6692d1b41b1f5d7f2a,
title = "Efficient benchmarking of algorithm configurators via model-based surrogates",
abstract = "The optimization of algorithm (hyper-)parameters is crucial for achieving peak performance across a wide range of domains, ranging from deep neural networks to solvers for hard combinatorial problems. However, the proper evaluation of new algorithm configuration (AC) procedures (or configurators) is hindered by two key hurdles. First, AC scenarios are hard to set up, including the target algorithm to be optimized and the problem instances to be solved. Second, and even more significantly, they are computationally expensive: a single configurator run involves many costly runs of the target algorithm. Here, we propose a benchmarking approach that uses surrogate scenarios, which are computationally cheap while remaining close to the original AC scenarios. These surrogate scenarios approximate the response surface corresponding to true target algorithm performance using a regression model. In our experiments, we construct and evaluate surrogate scenarios for hyperparameter optimization as well as for AC problems that involve performance optimization of solvers for hard combinatorial problems. We generalize previous work by building surrogates for AC scenarios with multiple problem instances, stochastic target algorithms and censored running time observations. We show that our surrogate scenarios capture overall important characteristics of the original AC scenarios from which they were derived, while being much easier to use and orders of magnitude cheaper to evaluate.",
keywords = "Algorithm configuration, Empirical performance model, Hyper-parameter optimization",
author = "Katharina Eggensperger and Marius Lindauer and Hoos, {Holger H.} and Frank Hutter and Kevin Leyton-Brown",
note = "Funding information: We thank Stefan Falkner for the implementation of the quantile regression forest used in our experiments and for fruitful discussions on early drafts of the paper. K. Eggensperger, M. Lindauer and F. Hutter acknowledge funding by the DFG (German Research Foundation) under Emmy Noether Grant HU 1900/2-1; K. Eggensperger also acknowledges funding by the State Graduate Funding Program of Baden-W{\"u}rttemberg. H. Hoos and K. Leyton-Brown acknowledge funding through NSERC Discovery Grants; K. Leyton-Brown also acknowledges funding from an NSERC E.W.R. Steacie Fellowship.",
year = "2018",
month = jan,
doi = "10.1007/s10994-017-5683-z",
language = "English",
volume = "107",
pages = "15--41",
journal = "Machine learning",
issn = "0885-6125",
publisher = "Springer Netherlands",
number = "1",

}

RIS

TY - JOUR

T1 - Efficient benchmarking of algorithm configurators via model-based surrogates

AU - Eggensperger, Katharina

AU - Lindauer, Marius

AU - Hoos, Holger H.

AU - Hutter, Frank

AU - Leyton-Brown, Kevin

N1 - Funding information: We thank Stefan Falkner for the implementation of the quantile regression forest used in our experiments and for fruitful discussions on early drafts of the paper. K. Eggensperger, M. Lindauer and F. Hutter acknowledge funding by the DFG (German Research Foundation) under Emmy Noether Grant HU 1900/2-1; K. Eggensperger also acknowledges funding by the State Graduate Funding Program of Baden-Württemberg. H. Hoos and K. Leyton-Brown acknowledge funding through NSERC Discovery Grants; K. Leyton-Brown also acknowledges funding from an NSERC E.W.R. Steacie Fellowship.

PY - 2018/1

Y1 - 2018/1

N2 - The optimization of algorithm (hyper-)parameters is crucial for achieving peak performance across a wide range of domains, ranging from deep neural networks to solvers for hard combinatorial problems. However, the proper evaluation of new algorithm configuration (AC) procedures (or configurators) is hindered by two key hurdles. First, AC scenarios are hard to set up, including the target algorithm to be optimized and the problem instances to be solved. Second, and even more significantly, they are computationally expensive: a single configurator run involves many costly runs of the target algorithm. Here, we propose a benchmarking approach that uses surrogate scenarios, which are computationally cheap while remaining close to the original AC scenarios. These surrogate scenarios approximate the response surface corresponding to true target algorithm performance using a regression model. In our experiments, we construct and evaluate surrogate scenarios for hyperparameter optimization as well as for AC problems that involve performance optimization of solvers for hard combinatorial problems. We generalize previous work by building surrogates for AC scenarios with multiple problem instances, stochastic target algorithms and censored running time observations. We show that our surrogate scenarios capture overall important characteristics of the original AC scenarios from which they were derived, while being much easier to use and orders of magnitude cheaper to evaluate.

AB - The optimization of algorithm (hyper-)parameters is crucial for achieving peak performance across a wide range of domains, ranging from deep neural networks to solvers for hard combinatorial problems. However, the proper evaluation of new algorithm configuration (AC) procedures (or configurators) is hindered by two key hurdles. First, AC scenarios are hard to set up, including the target algorithm to be optimized and the problem instances to be solved. Second, and even more significantly, they are computationally expensive: a single configurator run involves many costly runs of the target algorithm. Here, we propose a benchmarking approach that uses surrogate scenarios, which are computationally cheap while remaining close to the original AC scenarios. These surrogate scenarios approximate the response surface corresponding to true target algorithm performance using a regression model. In our experiments, we construct and evaluate surrogate scenarios for hyperparameter optimization as well as for AC problems that involve performance optimization of solvers for hard combinatorial problems. We generalize previous work by building surrogates for AC scenarios with multiple problem instances, stochastic target algorithms and censored running time observations. We show that our surrogate scenarios capture overall important characteristics of the original AC scenarios from which they were derived, while being much easier to use and orders of magnitude cheaper to evaluate.

KW - Algorithm configuration

KW - Empirical performance model

KW - Hyper-parameter optimization

UR - http://www.scopus.com/inward/record.url?scp=85038877814&partnerID=8YFLogxK

U2 - 10.1007/s10994-017-5683-z

DO - 10.1007/s10994-017-5683-z

M3 - Article

AN - SCOPUS:85038877814

VL - 107

SP - 15

EP - 41

JO - Machine learning

JF - Machine learning

SN - 0885-6125

IS - 1

ER -
