Automated Dynamic Algorithm Configuration

Publication: Contribution to journal › Article › Research › Peer-reviewed

Authors

External organisations

  • Albert-Ludwigs-Universität Freiburg

Details

Original language: English
Pages (from - to): 1633-1699
Number of pages: 67
Journal: Journal of Artificial Intelligence Research
Volume: 75
Publication status: Published - December 2022

Abstract

The performance of an algorithm often critically depends on its parameter configuration. While a variety of automated algorithm configuration methods have been proposed to relieve users from the tedious and error-prone task of manually tuning parameters, there is still a lot of untapped potential as the learned configuration is static, i.e., parameter settings remain fixed throughout the run. However, it has been shown that some algorithm parameters are best adjusted dynamically during execution, e.g., to adapt to the current part of the optimization landscape. Thus far, this is most commonly achieved through hand-crafted heuristics. A promising recent alternative is to automatically learn such dynamic parameter adaptation policies from data. In this article, we give the first comprehensive account of this new field of automated dynamic algorithm configuration (DAC), present a series of recent advances, and provide a solid foundation for future research in this field. Specifically, we (i) situate DAC in the broader historical context of AI research; (ii) formalize DAC as a computational problem; (iii) identify the methods used in prior art to tackle this problem; and (iv) conduct empirical case studies for using DAC in evolutionary optimization, AI planning, and machine learning.
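To make the static-versus-dynamic contrast concrete, below is a minimal, purely illustrative sketch (not taken from the paper): a toy gradient-descent run whose step size is either fixed for the whole run, as in classical static algorithm configuration, or adjusted at every iteration by a simple adaptation policy of the kind DAC aims to learn from data. The objective, the decay schedule, and all parameter values are hypothetical choices made for this example.

# Illustrative sketch only (not from the paper): contrasts a static parameter
# configuration with a dynamic adaptation policy on a toy gradient-descent task.
# The quadratic objective, the step-size schedule, and the policies below are
# hypothetical choices made for this example.

def run_gradient_descent(step_size_policy, x0=10.0, steps=50):
    """Minimize f(x) = x^2, with the step size chosen per iteration by the policy."""
    x = x0
    for t in range(steps):
        grad = 2.0 * x                    # gradient of f(x) = x^2
        eta = step_size_policy(t, x)      # policy observes the run state, returns a parameter value
        x = x - eta * grad
    return abs(x)                         # distance to the optimum at x = 0

# Static configuration: one fixed step size for the whole run
# (what classical algorithm configuration would tune).
static_policy = lambda t, x: 0.1

# Dynamic configuration: the step size is adapted during the run,
# here via a simple hand-crafted decay schedule; DAC would instead
# learn such a per-step policy from data.
dynamic_policy = lambda t, x: 0.4 / (1.0 + 0.1 * t)

print("static :", run_gradient_descent(static_policy))
print("dynamic:", run_gradient_descent(dynamic_policy))

In DAC, such a per-step policy would be learned automatically from data (e.g., via reinforcement learning across problem instances) rather than hand-crafted as in this sketch.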

ASJC Scopus subject areas

Cite

Automated Dynamic Algorithm Configuration. / Adriaensen, Steven; Biedenkapp, André; Shala, Gresa et al.
In: Journal of Artificial Intelligence Research, Vol. 75, 12.2022, pp. 1633-1699.


Adriaensen S, Biedenkapp A, Shala G, Awad N, Eimer T, Lindauer M et al. Automated Dynamic Algorithm Configuration. Journal of Artificial Intelligence Research. 2022 Dec;75:1633-1699. doi: 10.48550/arXiv.2205.13881, 10.1613/jair.1.13922
Adriaensen, Steven ; Biedenkapp, André ; Shala, Gresa et al. / Automated Dynamic Algorithm Configuration. In: Journal of Artificial Intelligence Research. 2022 ; Vol. 75. pp. 1633-1699.
BibTeX
@article{2872944e4e864ceeb0fb4ba913b231b8,
title = "Automated Dynamic Algorithm Configuration",
abstract = "The performance of an algorithm often critically depends on its parameter configuration. While a variety of automated algorithm configuration methods have been proposed to relieve users from the tedious and error-prone task of manually tuning parameters, there is still a lot of untapped potential as the learned configuration is static, i.e., parameter settings remain fixed throughout the run. However, it has been shown that some algorithm parameters are best adjusted dynamically during execution. Thus far, this is most commonly achieved through hand-crafted heuristics. A promising recent alternative is to automatically learn such dynamic parameter adaptation policies from data. In this article, we give the first comprehensive account of this new field of automated dynamic algorithm configuration (DAC), present a series of recent advances, and provide a solid foundation for future research in this field. Specifically, we (i) situate DAC in the broader historical context of AI research; (ii) formalize DAC as a computational problem; (iii) identify the methods used in prior art to tackle this problem; and (iv) conduct empirical case studies for using DAC in evolutionary optimization, AI planning, and machine learning.",
keywords = "cs.AI, cs.LG, cs.NE",
author = "Steven Adriaensen and Andr{\'e} Biedenkapp and Gresa Shala and Noor Awad and Theresa Eimer and Marius Lindauer and Frank Hutter",
note = "Funding Information: All authors acknowledge funding by the Robert Bosch GmbH. Theresa Eimer and Marius Lindauer acknowledge funding by the German Research Foundation (DFG) under LI 2801/4-1. We thank Maximilian Reimer, Rishan Senanayake, G{\"o}ktuǧ Karaka{\c s}li, Nguyen Dang, Diederick Vermetten, Jacob de Nobel and Carolin Benjamins for their contributions to DACBench, and Carola Doerr for the many discussions on related work and problem formulation. ",
year = "2022",
month = dec,
doi = "10.48550/arXiv.2205.13881",
language = "English",
volume = "75",
pages = "1633--1699",
journal = "Journal of Artificial Intelligence Research",
issn = "1076-9757",
publisher = "Morgan Kaufmann Publishers, Inc.",

}

RIS

TY - JOUR

T1 - Automated Dynamic Algorithm Configuration

AU - Adriaensen, Steven

AU - Biedenkapp, André

AU - Shala, Gresa

AU - Awad, Noor

AU - Eimer, Theresa

AU - Lindauer, Marius

AU - Hutter, Frank

N1 - Funding Information: All authors acknowledge funding by the Robert Bosch GmbH. Theresa Eimer and Marius Lindauer acknowledge funding by the German Research Foundation (DFG) under LI 2801/4-1. We thank Maximilian Reimer, Rishan Senanayake, Göktuǧ Karakaşli, Nguyen Dang, Diederick Vermetten, Jacob de Nobel and Carolin Benjamins for their contributions to DACBench, and Carola Doerr for the many discussions on related work and problem formulation.

PY - 2022/12

Y1 - 2022/12

N2 - The performance of an algorithm often critically depends on its parameter configuration. While a variety of automated algorithm configuration methods have been proposed to relieve users from the tedious and error-prone task of manually tuning parameters, there is still a lot of untapped potential as the learned configuration is static, i.e., parameter settings remain fixed throughout the run. However, it has been shown that some algorithm parameters are best adjusted dynamically during execution. Thus far, this is most commonly achieved through hand-crafted heuristics. A promising recent alternative is to automatically learn such dynamic parameter adaptation policies from data. In this article, we give the first comprehensive account of this new field of automated dynamic algorithm configuration (DAC), present a series of recent advances, and provide a solid foundation for future research in this field. Specifically, we (i) situate DAC in the broader historical context of AI research; (ii) formalize DAC as a computational problem; (iii) identify the methods used in prior art to tackle this problem; and (iv) conduct empirical case studies for using DAC in evolutionary optimization, AI planning, and machine learning.

AB - The performance of an algorithm often critically depends on its parameter configuration. While a variety of automated algorithm configuration methods have been proposed to relieve users from the tedious and error-prone task of manually tuning parameters, there is still a lot of untapped potential as the learned configuration is static, i.e., parameter settings remain fixed throughout the run. However, it has been shown that some algorithm parameters are best adjusted dynamically during execution. Thus far, this is most commonly achieved through hand-crafted heuristics. A promising recent alternative is to automatically learn such dynamic parameter adaptation policies from data. In this article, we give the first comprehensive account of this new field of automated dynamic algorithm configuration (DAC), present a series of recent advances, and provide a solid foundation for future research in this field. Specifically, we (i) situate DAC in the broader historical context of AI research; (ii) formalize DAC as a computational problem; (iii) identify the methods used in prior art to tackle this problem; and (iv) conduct empirical case studies for using DAC in evolutionary optimization, AI planning, and machine learning.

KW - cs.AI

KW - cs.LG

KW - cs.NE

UR - http://www.scopus.com/inward/record.url?scp=85148436940&partnerID=8YFLogxK

U2 - 10.48550/arXiv.2205.13881

DO - 10.48550/arXiv.2205.13881

M3 - Article

VL - 75

SP - 1633

EP - 1699

JO - Journal of Artificial Intelligence Research

JF - Journal of Artificial Intelligence Research

SN - 1076-9757

ER -
