Details
Original language | English |
---|---|
Title of host publication | Discovery Science - 23rd International Conference, DS 2020, Proceedings |
Editors | Annalisa Appice, Grigorios Tsoumakas, Yannis Manolopoulos, Stan Matwin |
Publisher | Springer Science and Business Media Deutschland GmbH |
Pages | 309-324 |
Number of pages | 16 |
ISBN (print) | 9783030615260 |
Publication status | Published - 2020 |
Externally published | Yes |
Event | 23rd International Conference on Discovery Science, DS 2020 - Thessaloniki, Greece; Duration: 19 Oct 2020 → 21 Oct 2020 |
Publication series
Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
---|---|
Volume | 12323 LNAI |
ISSN (Print) | 0302-9743 |
ISSN (Electronic) | 1611-3349 |
Abstract
Algorithm selection (AS) deals with automatically selecting, from a fixed set of candidate algorithms, the algorithm most suitable for a specific instance of an algorithmic problem class, e.g., choosing solvers for SAT problems. Benchmark suites for AS usually comprise candidate sets consisting of at most tens of algorithms, whereas in algorithm configuration (AC) and combined algorithm selection and hyperparameter optimization (CASH) problems the number of candidates becomes intractable, impeding the learning of effective meta-models and thus requiring costly online performance evaluations. In this paper, we propose the setting of extreme algorithm selection (XAS), which, despite assuming limited time resources and hence excluding online evaluations at prediction time, allows for considering thousands of candidate algorithms and thereby facilitates meta learning. We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic representation, in which both problem instances and algorithms are described in terms of feature vectors. We find this approach to significantly improve over the current state of the art in various metrics.
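The dyadic idea described in the abstract can be illustrated with a minimal sketch, not the authors' implementation: a surrogate regressor is trained on concatenated instance and algorithm feature vectors, and at prediction time it scores every candidate purely offline before picking the top one. The model choice (scikit-learn's RandomForestRegressor), the feature dimensions, and all function names below are assumptions made for illustration.

```python
# Illustrative sketch of a dyadic surrogate for extreme algorithm selection (XAS).
# Assumption: training data is available as (instance features, algorithm features, performance)
# triples; the regressor and all dimensions are arbitrary choices, not taken from the paper.
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def train_dyadic_surrogate(instance_feats, algorithm_feats, performances):
    """Fit a regressor on concatenated (instance, algorithm) feature vectors."""
    X = np.hstack([instance_feats, algorithm_feats])  # dyadic representation
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X, performances)
    return model


def select_algorithm(model, instance_feat, candidate_feats):
    """Score all candidates offline (no online evaluation) and return the best index."""
    n = len(candidate_feats)
    X = np.hstack([np.tile(instance_feat, (n, 1)), candidate_feats])
    scores = model.predict(X)
    return int(np.argmax(scores))  # assumes higher predicted performance is better


# Toy usage with random data: 200 training dyads, 1000 candidate algorithms.
rng = np.random.default_rng(0)
inst_tr = rng.normal(size=(200, 10))     # instance feature vectors
algo_tr = rng.normal(size=(200, 5))      # algorithm feature vectors
perf_tr = rng.normal(size=200)           # observed performances
model = train_dyadic_surrogate(inst_tr, algo_tr, perf_tr)

candidates = rng.normal(size=(1000, 5))  # feature vectors of all candidate algorithms
best = select_algorithm(model, rng.normal(size=10), candidates)
print("selected candidate:", best)
```

The key property the sketch tries to convey is that, because candidates are represented by features rather than enumerated as labels, ranking thousands of algorithms reduces to a batch of cheap model predictions.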
Keywords
- Dyadic ranking
- Extreme algorithm selection
- Surrogate model
ASJC Scopus subject areas
- Mathematics (all)
- Theoretical Computer Science
- Computer Science (all)
Cite this
Extreme Algorithm Selection with Dyadic Feature Representation. / Tornede, Alexander; Wever, Marcel; Hüllermeier, Eyke. Discovery Science - 23rd International Conference, DS 2020, Proceedings. ed. / Annalisa Appice; Grigorios Tsoumakas; Yannis Manolopoulos; Stan Matwin. Springer Science and Business Media Deutschland GmbH, 2020. p. 309-324 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 12323 LNAI).
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer-review
TY - GEN
T1 - Extreme Algorithm Selection with Dyadic Feature Representation
AU - Tornede, Alexander
AU - Wever, Marcel
AU - Hüllermeier, Eyke
N1 - Publisher Copyright: © 2020, Springer Nature Switzerland AG.
PY - 2020
Y1 - 2020
N2 - Algorithm selection (AS) deals with the automatic selection of an algorithm from a fixed set of candidate algorithms most suitable for a specific instance of an algorithmic problem class, e.g., choosing solvers for SAT problems. Benchmark suites for AS usually comprise candidate sets consisting of at most tens of algorithms, whereas in algorithm configuration (AC) and combined algorithm selection and hyperparameter optimization (CASH) problems the number of candidates becomes intractable, impeding to learn effective meta-models and thus requiring costly online performance evaluations. In this paper, we propose the setting of extreme algorithm selection (XAS), which, despite assuming limited time resources and hence excluding online evaluations at prediction time, allows for considering thousands of candidate algorithms and thereby facilitates meta learning. We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic representation, in which both problem instances and algorithms are described in terms of feature vectors. We find this approach to significantly improve over the current state of the art in various metrics.
AB - Algorithm selection (AS) deals with the automatic selection of an algorithm from a fixed set of candidate algorithms most suitable for a specific instance of an algorithmic problem class, e.g., choosing solvers for SAT problems. Benchmark suites for AS usually comprise candidate sets consisting of at most tens of algorithms, whereas in algorithm configuration (AC) and combined algorithm selection and hyperparameter optimization (CASH) problems the number of candidates becomes intractable, impeding to learn effective meta-models and thus requiring costly online performance evaluations. In this paper, we propose the setting of extreme algorithm selection (XAS), which, despite assuming limited time resources and hence excluding online evaluations at prediction time, allows for considering thousands of candidate algorithms and thereby facilitates meta learning. We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic representation, in which both problem instances and algorithms are described in terms of feature vectors. We find this approach to significantly improve over the current state of the art in various metrics.
KW - Dyadic ranking
KW - Extreme algorithm selection
KW - Surrogate model
UR - http://www.scopus.com/inward/record.url?scp=85094096801&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-61527-7_21
DO - 10.1007/978-3-030-61527-7_21
M3 - Conference contribution
SN - 9783030615260
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 309
EP - 324
BT - Discovery Science - 23rd International Conference, DS 2020, Proceedings
A2 - Appice, Annalisa
A2 - Tsoumakas, Grigorios
A2 - Manolopoulos, Yannis
A2 - Matwin, Stan
PB - Springer Science and Business Media Deutschland GmbH
T2 - 23rd International Conference on Discovery Science, DS 2020
Y2 - 19 October 2020 through 21 October 2020
ER -