Extreme Algorithm Selection with Dyadic Feature Representation

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Alexander Tornede
  • Marcel Wever
  • Eyke Hüllermeier

External Research Organisations

  • Heinz Nixdorf Institute
  • Paderborn University

Details

Original language: English
Title of host publication: Discovery Science - 23rd International Conference, DS 2020, Proceedings
Editors: Annalisa Appice, Grigorios Tsoumakas, Yannis Manolopoulos, Stan Matwin
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 309-324
Number of pages: 16
ISBN (print): 9783030615260
Publication status: Published - 2020
Externally published: Yes
Event: 23rd International Conference on Discovery Science, DS 2020 - Thessaloniki, Greece
Duration: 19 Oct 2020 - 21 Oct 2020

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12323 LNAI
ISSN (print): 0302-9743
ISSN (electronic): 1611-3349

Abstract

Algorithm selection (AS) deals with the automatic selection of an algorithm from a fixed set of candidate algorithms most suitable for a specific instance of an algorithmic problem class, e.g., choosing solvers for SAT problems. Benchmark suites for AS usually comprise candidate sets consisting of at most tens of algorithms, whereas in algorithm configuration (AC) and combined algorithm selection and hyperparameter optimization (CASH) problems the number of candidates becomes intractable, impeding the learning of effective meta-models and thus requiring costly online performance evaluations. In this paper, we propose the setting of extreme algorithm selection (XAS), which, despite assuming limited time resources and hence excluding online evaluations at prediction time, allows for considering thousands of candidate algorithms and thereby facilitates meta learning. We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic representation, in which both problem instances and algorithms are described in terms of feature vectors. We find this approach to significantly improve over the current state of the art in various metrics.
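
As a rough illustration of the dyadic representation described in the abstract, the sketch below trains a single regression surrogate on concatenated instance/algorithm feature vectors and, at prediction time, scores all candidate algorithms for a new instance purely offline, i.e., without any online evaluations. The surrogate model, feature dimensions, and data are illustrative assumptions, not the authors' implementation.

# Illustrative sketch only: a dyadic surrogate for (extreme) algorithm selection.
# Instances and algorithms are each described by feature vectors; one regression
# model is trained on their concatenation to predict performance. Model choice
# and all data below are assumptions, not the paper's actual method.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical meta-data: 200 training instances, 1000 candidate algorithms.
n_instances, n_algorithms = 200, 1000
instance_features = rng.normal(size=(n_instances, 10))    # e.g., SAT instance statistics
algorithm_features = rng.normal(size=(n_algorithms, 5))   # e.g., encoded configurations

# Observed performances for a sparse sample of (instance, algorithm) dyads.
pairs = [(i, j) for i in range(n_instances)
         for j in rng.choice(n_algorithms, 20, replace=False)]
X = np.array([np.concatenate([instance_features[i], algorithm_features[j]])
              for i, j in pairs])
y = rng.normal(size=len(pairs))  # placeholder performance values (e.g., runtimes)

surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Prediction time: score every candidate algorithm for a new instance offline
# and pick the best-ranked one (argmin, since lower runtime is better here).
new_instance = rng.normal(size=10)
dyads = np.hstack([np.tile(new_instance, (n_algorithms, 1)), algorithm_features])
scores = surrogate.predict(dyads)
best_algorithm = int(np.argmin(scores))
print("selected algorithm index:", best_algorithm)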

Keywords

    Dyadic ranking, Extreme algorithm selection, Surrogate model

Cite this

Extreme Algorithm Selection with Dyadic Feature Representation. / Tornede, Alexander; Wever, Marcel; Hüllermeier, Eyke.
Discovery Science - 23rd International Conference, DS 2020, Proceedings. ed. / Annalisa Appice; Grigorios Tsoumakas; Yannis Manolopoulos; Stan Matwin. Springer Science and Business Media Deutschland GmbH, 2020. p. 309-324 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 12323 LNAI).

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Tornede, A, Wever, M & Hüllermeier, E 2020, Extreme Algorithm Selection with Dyadic Feature Representation. in A Appice, G Tsoumakas, Y Manolopoulos & S Matwin (eds), Discovery Science - 23rd International Conference, DS 2020, Proceedings. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 12323 LNAI, Springer Science and Business Media Deutschland GmbH, pp. 309-324, 23rd International Conference on Discovery Science, DS 2020, Thessaloniki, Greece, 19 Oct 2020. https://doi.org/10.1007/978-3-030-61527-7_21
Tornede, A., Wever, M., & Hüllermeier, E. (2020). Extreme Algorithm Selection with Dyadic Feature Representation. In A. Appice, G. Tsoumakas, Y. Manolopoulos, & S. Matwin (Eds.), Discovery Science - 23rd International Conference, DS 2020, Proceedings (pp. 309-324). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 12323 LNAI). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-61527-7_21
Tornede A, Wever M, Hüllermeier E. Extreme Algorithm Selection with Dyadic Feature Representation. In Appice A, Tsoumakas G, Manolopoulos Y, Matwin S, editors, Discovery Science - 23rd International Conference, DS 2020, Proceedings. Springer Science and Business Media Deutschland GmbH. 2020. p. 309-324. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). doi: 10.1007/978-3-030-61527-7_21
Tornede, Alexander ; Wever, Marcel ; Hüllermeier, Eyke. / Extreme Algorithm Selection with Dyadic Feature Representation. Discovery Science - 23rd International Conference, DS 2020, Proceedings. editor / Annalisa Appice ; Grigorios Tsoumakas ; Yannis Manolopoulos ; Stan Matwin. Springer Science and Business Media Deutschland GmbH, 2020. pp. 309-324 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)).
@inproceedings{5d7a6c6615914aa18824f5d836192990,
title = "Extreme Algorithm Selection with Dyadic Feature Representation",
abstract = "Algorithm selection (AS) deals with the automatic selection of an algorithm from a fixed set of candidate algorithms most suitable for a specific instance of an algorithmic problem class, e.g., choosing solvers for SAT problems. Benchmark suites for AS usually comprise candidate sets consisting of at most tens of algorithms, whereas in algorithm configuration (AC) and combined algorithm selection and hyperparameter optimization (CASH) problems the number of candidates becomes intractable, impeding to learn effective meta-models and thus requiring costly online performance evaluations. In this paper, we propose the setting of extreme algorithm selection (XAS), which, despite assuming limited time resources and hence excluding online evaluations at prediction time, allows for considering thousands of candidate algorithms and thereby facilitates meta learning. We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic representation, in which both problem instances and algorithms are described in terms of feature vectors. We find this approach to significantly improve over the current state of the art in various metrics.",
keywords = "Dyadic ranking, Extreme algorithm selection, Surrogate model",
author = "Alexander Tornede and Marcel Wever and Eyke H{\"u}llermeier",
note = "Publisher Copyright: {\textcopyright} 2020, Springer Nature Switzerland AG.; 23rd International Conference on Discovery Science, DS 2020 ; Conference date: 19-10-2020 Through 21-10-2020",
year = "2020",
doi = "10.1007/978-3-030-61527-7_21",
language = "English",
isbn = "9783030615260",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Science and Business Media Deutschland GmbH",
pages = "309--324",
editor = "Annalisa Appice and Grigorios Tsoumakas and Yannis Manolopoulos and Stan Matwin",
booktitle = "Discovery Science - 23rd International Conference, DS 2020, Proceedings",
address = "Germany",

}


TY - GEN

T1 - Extreme Algorithm Selection with Dyadic Feature Representation

AU - Tornede, Alexander

AU - Wever, Marcel

AU - Hüllermeier, Eyke

N1 - Publisher Copyright: © 2020, Springer Nature Switzerland AG.

PY - 2020

Y1 - 2020

N2 - Algorithm selection (AS) deals with the automatic selection of an algorithm from a fixed set of candidate algorithms most suitable for a specific instance of an algorithmic problem class, e.g., choosing solvers for SAT problems. Benchmark suites for AS usually comprise candidate sets consisting of at most tens of algorithms, whereas in algorithm configuration (AC) and combined algorithm selection and hyperparameter optimization (CASH) problems the number of candidates becomes intractable, impeding to learn effective meta-models and thus requiring costly online performance evaluations. In this paper, we propose the setting of extreme algorithm selection (XAS), which, despite assuming limited time resources and hence excluding online evaluations at prediction time, allows for considering thousands of candidate algorithms and thereby facilitates meta learning. We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic representation, in which both problem instances and algorithms are described in terms of feature vectors. We find this approach to significantly improve over the current state of the art in various metrics.

AB - Algorithm selection (AS) deals with the automatic selection of an algorithm from a fixed set of candidate algorithms most suitable for a specific instance of an algorithmic problem class, e.g., choosing solvers for SAT problems. Benchmark suites for AS usually comprise candidate sets consisting of at most tens of algorithms, whereas in algorithm configuration (AC) and combined algorithm selection and hyperparameter optimization (CASH) problems the number of candidates becomes intractable, impeding to learn effective meta-models and thus requiring costly online performance evaluations. In this paper, we propose the setting of extreme algorithm selection (XAS), which, despite assuming limited time resources and hence excluding online evaluations at prediction time, allows for considering thousands of candidate algorithms and thereby facilitates meta learning. We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic representation, in which both problem instances and algorithms are described in terms of feature vectors. We find this approach to significantly improve over the current state of the art in various metrics.

KW - Dyadic ranking

KW - Extreme algorithm selection

KW - Surrogate model

UR - http://www.scopus.com/inward/record.url?scp=85094096801&partnerID=8YFLogxK

U2 - 10.1007/978-3-030-61527-7_21

DO - 10.1007/978-3-030-61527-7_21

M3 - Conference contribution

SN - 9783030615260

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 309

EP - 324

BT - Discovery Science - 23rd International Conference, DS 2020, Proceedings

A2 - Appice, Annalisa

A2 - Tsoumakas, Grigorios

A2 - Manolopoulos, Yannis

A2 - Matwin, Stan

PB - Springer Science and Business Media Deutschland GmbH

T2 - 23rd International Conference on Discovery Science, DS 2020

Y2 - 19 October 2020 through 21 October 2020

ER -
