Neural Networks for Predicting Algorithm Runtime Distributions

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

  • Katharina Eggensperger
  • Marius Lindauer
  • Frank Hutter

External organisations

  • Albert-Ludwigs-Universität Freiburg

Details

Original language: English
Title of host publication: Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Editors: Jerome Lang
Pages: 1442-1448
Number of pages: 7
ISBN (electronic): 9780999241127
Publication status: Published - 2018
Published externally: Yes
Event: 27th International Joint Conference on Artificial Intelligence, IJCAI 2018 - Stockholm, Sweden
Duration: 13 July 2018 - 19 July 2018

Abstract

Many state-of-the-art algorithms for solving hard combinatorial problems in artificial intelligence (AI) include elements of stochasticity that lead to high variations in runtime, even for a fixed problem instance. Knowledge about the resulting runtime distributions (RTDs) of algorithms on given problem instances can be exploited in various meta-algorithmic procedures, such as algorithm selection, portfolios, and randomized restarts. Previous work has shown that machine learning can be used to individually predict mean, median and variance of RTDs. To establish a new state-of-the-art in predicting RTDs, we demonstrate that the parameters of an RTD should be learned jointly and that neural networks can do this well by directly optimizing the likelihood of an RTD given runtime observations. In an empirical study involving five algorithms for SAT solving and AI planning, we show that neural networks predict the true RTDs of unseen instances better than previous methods, and can even do so when only few runtime observations are available per training instance.
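To illustrate the central idea described in the abstract, the following minimal sketch trains a small feed-forward network that maps instance features to the parameters of a parametric runtime distribution and learns them jointly by minimizing the negative log-likelihood of observed runtimes. This is a hypothetical illustration, not the authors' implementation: the lognormal parameterization, the use of PyTorch, and all names (RTDNet, neg_log_likelihood), architectures, and hyperparameters are assumptions made here for clarity.

# Hypothetical sketch (not the paper's code): a network predicts the
# parameters (mu, sigma) of a lognormal runtime distribution per instance
# and is trained by minimizing the negative log-likelihood of observed runtimes.
import torch
import torch.nn as nn

class RTDNet(nn.Module):
    def __init__(self, n_features: int, hidden: int = 16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.head = nn.Linear(hidden, 2)  # both RTD parameters are predicted jointly

    def forward(self, x):
        mu, log_sigma = self.head(self.body(x)).unbind(dim=-1)
        return mu, log_sigma.exp().clamp(min=1e-3)  # sigma must stay positive

def neg_log_likelihood(mu, sigma, runtimes):
    # Likelihood of the observed runtimes under the predicted lognormal RTDs.
    return -torch.distributions.LogNormal(mu, sigma).log_prob(runtimes).mean()

# Dummy training step: 32 instances with 10 features and one runtime observation each.
model = RTDNet(n_features=10)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
features = torch.rand(32, 10)
runtimes = torch.rand(32).exp()  # placeholder positive runtimes
mu, sigma = model(features)
loss = neg_log_likelihood(mu, sigma, runtimes)
loss.backward()
optimizer.step()

In a setup of this kind, multiple runtime observations per training instance simply contribute additional (features, runtime) pairs to the likelihood, which is why few observations per instance can suffice.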

ASJC Scopus subject areas

Cite

Neural Networks for Predicting Algorithm Runtime Distributions. / Eggensperger, Katharina; Lindauer, Marius; Hutter, Frank.
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence. Ed. / Jerome Lang. 2018. pp. 1442-1448.


Eggensperger, K, Lindauer, M & Hutter, F 2018, Neural Networks for Predicting Algorithm Runtime Distributions. in J Lang (ed.), Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence. pp. 1442-1448, 27th International Joint Conference on Artificial Intelligence, IJCAI 2018, Stockholm, Sweden, 13 July 2018. https://doi.org/10.24963/ijcai.2018/200
Eggensperger, K., Lindauer, M., & Hutter, F. (2018). Neural Networks for Predicting Algorithm Runtime Distributions. In J. Lang (Ed.), Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence (pp. 1442-1448). https://doi.org/10.24963/ijcai.2018/200
Eggensperger K, Lindauer M, Hutter F. Neural Networks for Predicting Algorithm Runtime Distributions. In Lang J, editor, Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence. 2018. p. 1442-1448. doi: 10.24963/ijcai.2018/200
Eggensperger, Katharina ; Lindauer, Marius ; Hutter, Frank. / Neural Networks for Predicting Algorithm Runtime Distributions. Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence. Ed. / Jerome Lang. 2018. pp. 1442-1448
@inproceedings{b975df3f4ab64fefad06163b01b34ccd,
title = "Neural Networks for Predicting Algorithm Runtime Distributions",
abstract = "Many state-of-the-art algorithms for solving hard combinatorial problems in artificial intelligence (AI) include elements of stochasticity that lead to high variations in runtime, even for a fixed problem instance. Knowledge about the resulting runtime distributions (RTDs) of algorithms on given problem instances can be exploited in various meta-algorithmic procedures, such as algorithm selection, portfolios, and randomized restarts. Previous work has shown that machine learning can be used to individually predict mean, median and variance of RTDs. To establish a new state-of-the-art in predicting RTDs, we demonstrate that the parameters of an RTD should be learned jointly and that neural networks can do this well by directly optimizing the likelihood of an RTD given runtime observations. In an empirical study involving five algorithms for SAT solving and AI planning, we show that neural networks predict the true RTDs of unseen instances better than previous methods, and can even do so when only few runtime observations are available per training instance.",
author = "Katharina Eggensperger and Marius Lindauer and Frank Hutter",
note = "Funding information: The authors acknowledge funding by the German Research Foundation (DFG) under Emmy Noether grant HU 1900/2-1 and support by the state of Baden-W{\"u}rttemberg through bwHPC and through grant no INST 39/963-1 FUGG. K. Eggensperger additionally acknowledges funding by the State Graduate Funding Program of Baden-W{\"u}rttemberg.; 27th International Joint Conference on Artificial Intelligence, IJCAI 2018 ; Conference date: 13-07-2018 Through 19-07-2018",
year = "2018",
doi = "10.24963/ijcai.2018/200",
language = "English",
pages = "1442--1448",
editor = "Jerome Lang",
booktitle = "Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence",

}


TY - GEN

T1 - Neural Networks for Predicting Algorithm Runtime Distributions

AU - Eggensperger, Katharina

AU - Lindauer, Marius

AU - Hutter, Frank

N1 - Funding information: The authors acknowledge funding by the German Research Foundation (DFG) under Emmy Noether grant HU 1900/2-1 and support by the state of Baden-Württemberg through bwHPC and through grant no INST 39/963-1 FUGG. K. Eggensperger additionally acknowledges funding by the State Graduate Funding Program of Baden-Württemberg.

PY - 2018

Y1 - 2018

N2 - Many state-of-the-art algorithms for solving hard combinatorial problems in artificial intelligence (AI) include elements of stochasticity that lead to high variations in runtime, even for a fixed problem instance. Knowledge about the resulting runtime distributions (RTDs) of algorithms on given problem instances can be exploited in various meta-algorithmic procedures, such as algorithm selection, portfolios, and randomized restarts. Previous work has shown that machine learning can be used to individually predict mean, median and variance of RTDs. To establish a new state-of-the-art in predicting RTDs, we demonstrate that the parameters of an RTD should be learned jointly and that neural networks can do this well by directly optimizing the likelihood of an RTD given runtime observations. In an empirical study involving five algorithms for SAT solving and AI planning, we show that neural networks predict the true RTDs of unseen instances better than previous methods, and can even do so when only few runtime observations are available per training instance.

AB - Many state-of-the-art algorithms for solving hard combinatorial problems in artificial intelligence (AI) include elements of stochasticity that lead to high variations in runtime, even for a fixed problem instance. Knowledge about the resulting runtime distributions (RTDs) of algorithms on given problem instances can be exploited in various meta-algorithmic procedures, such as algorithm selection, portfolios, and randomized restarts. Previous work has shown that machine learning can be used to individually predict mean, median and variance of RTDs. To establish a new state-of-the-art in predicting RTDs, we demonstrate that the parameters of an RTD should be learned jointly and that neural networks can do this well by directly optimizing the likelihood of an RTD given runtime observations. In an empirical study involving five algorithms for SAT solving and AI planning, we show that neural networks predict the true RTDs of unseen instances better than previous methods, and can even do so when only few runtime observations are available per training instance.

UR - http://www.scopus.com/inward/record.url?scp=85055674327&partnerID=8YFLogxK

U2 - 10.24963/ijcai.2018/200

DO - 10.24963/ijcai.2018/200

M3 - Conference contribution

AN - SCOPUS:85055674327

SP - 1442

EP - 1448

BT - Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence

A2 - Lang, Jerome

T2 - 27th International Joint Conference on Artificial Intelligence, IJCAI 2018

Y2 - 13 July 2018 through 19 July 2018

ER -
