Details
Original language | English |
---|---|
Title of host publication | Machine Learning and Knowledge Discovery in Databases. Research Track |
Subtitle | European Conference, ECML PKDD 2021, Proceedings |
Editors | Nuria Oliver, Fernando Pérez-Cruz, Stefan Kramer, Jesse Read, Jose A. Lozano |
Place of publication | Cham |
Publisher | Springer Nature Switzerland AG |
Pages | 265-296 |
Number of pages | 32 |
Volume | 3 |
ISBN (electronic) | 978-3-030-86523-8 |
ISBN (print) | 978-3-030-86522-1 |
Publication status | Published - 2021 |
Event | European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2021 - Bilbao, Spain. Duration: 13 Sep 2021 → 17 Sep 2021 |
Publication series
Name | Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) |
---|---|
Volume | 12977 |
ISSN (Print) | 0302-9743 |
ISSN (elektronisch) | 1611-3349 |
Abstract
While Bayesian Optimization (BO) is a very popular method for optimizing expensive black-box functions, it fails to leverage the experience of domain experts. This causes BO to waste function evaluations on bad design choices (e.g., machine learning hyperparameters) that the expert already knows to work poorly. To address this issue, we introduce Bayesian Optimization with a Prior for the Optimum (BOPrO). BOPrO allows users to inject their knowledge into the optimization process in the form of priors about which parts of the input space will yield the best performance, rather than BO’s standard priors over functions, which are much less intuitive for users. BOPrO then combines these priors with BO’s standard probabilistic model to form a pseudo-posterior used to select which points to evaluate next. We show that BOPrO is around 6.67 × faster than state-of-the-art methods on a common suite of benchmarks, and achieves a new state-of-the-art performance on a real-world hardware design application. We also show that BOPrO converges faster even if the priors for the optimum are not entirely accurate and that it robustly recovers from misleading priors.
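The abstract's description of BOPrO (a user prior over where the optimum lies, combined with the surrogate model's own probability of a point being good into a pseudo-posterior that guides which point to evaluate next) can be illustrated with a small sketch. The code below is not the authors' implementation; it assumes a TPE-style density-ratio acquisition, and every function name, the `t / beta` weighting schedule, and the toy densities are illustrative assumptions only.

```python
# Hedged sketch (not the paper's code): combines a user prior for the optimum
# with a model-based "probability of being good" into a pseudo-posterior,
# then ranks candidates by a good/bad density ratio (an EI-like acquisition).
import numpy as np

def pseudo_posterior(prior_good, model_good, t, beta=10.0):
    """Combine prior and model beliefs that a point is 'good'.

    prior_good : user prior density that x is near the optimum
    model_good : model-based probability that x yields a good value
    t          : iterations so far; the model's influence grows with t
    """
    return prior_good * model_good ** (t / beta)

def suggest_next(candidates, prior_fn, good_model, bad_model, t):
    """Pick the candidate maximizing the ratio of 'good' to 'bad'
    pseudo-posteriors."""
    prior = prior_fn(candidates)
    g = pseudo_posterior(prior, good_model(candidates), t)
    b = pseudo_posterior(1.0 - prior, bad_model(candidates), t)
    scores = g / np.maximum(b, 1e-12)
    return candidates[np.argmax(scores)]

# Toy usage on a 1-D search space: a Gaussian prior centered where an expert
# believes the optimum lies, and placeholder densities standing in for the
# fitted 'good'/'bad' models.
if __name__ == "__main__":
    xs = np.linspace(0.0, 1.0, 501)
    prior_fn = lambda x: np.exp(-0.5 * ((x - 0.3) / 0.1) ** 2)
    good_model = lambda x: np.exp(-0.5 * ((x - 0.35) / 0.15) ** 2)
    bad_model = lambda x: np.full_like(x, 0.5)
    print("next point to evaluate:", suggest_next(xs, prior_fn, good_model, bad_model, t=5))
```

The sketch is only meant to show the mechanism the abstract describes: early on the prior dominates the pseudo-posterior, and as `t` grows the model term increasingly overrides it, which is consistent with the claim that BOPrO recovers from misleading priors.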
ASJC Scopus subject areas
- Mathematics (all)
- Theoretical Computer Science
- Computer Science (all)
- General Computer Science
Cite
Machine Learning and Knowledge Discovery in Databases. Research Track: European Conference, ECML PKDD 2021, Proceedings. Ed. / Nuria Oliver; Fernando Pérez-Cruz; Stefan Kramer; Jesse Read; Jose A. Lozano. Vol. 3 Cham: Springer Nature Switzerland AG, 2021. pp. 265-296 (Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science); Vol. 12977).
Publication: Contribution to book/report/anthology/conference proceedings › Conference contribution › Research › Peer-review
TY - GEN
T1 - Bayesian Optimization with a Prior for the Optimum
AU - Souza, Artur
AU - Nardi, Luigi
AU - Oliveira, Leonardo B.
AU - Olukotun, Kunle
AU - Lindauer, Marius
AU - Hutter, Frank
N1 - Funding Information: Luigi Nardi and Kunle Olukotun were supported in part by affiliate members and other supporters of the Stanford DAWN project—Ant Financial, Facebook, Google, Intel, Microsoft, NEC, SAP, Teradata, and VMware. Luigi Nardi was also partially supported by the Wallenberg AI, Autonomous Systems and Software Program (WASP) funded by the Knut and Alice Wallenberg Foundation. Artur Souza and Leonardo B. Oliveira were supported by CAPES, CNPq, and FAPEMIG. Frank Hutter acknowledges support by the European Research Council (ERC) under the European Union Horizon 2020 research and innovation programme through grant no. 716721. The computations were also enabled by resources provided by the Swedish National Infrastructure for Computing (SNIC) at LUNARC partially funded by the Swedish Research Council through grant agreement no. 2018-05973.
PY - 2021
Y1 - 2021
N2 - While Bayesian Optimization (BO) is a very popular method for optimizing expensive black-box functions, it fails to leverage the experience of domain experts. This causes BO to waste function evaluations on bad design choices (e.g., machine learning hyperparameters) that the expert already knows to work poorly. To address this issue, we introduce Bayesian Optimization with a Prior for the Optimum (BOPrO). BOPrO allows users to inject their knowledge into the optimization process in the form of priors about which parts of the input space will yield the best performance, rather than BO’s standard priors over functions, which are much less intuitive for users. BOPrO then combines these priors with BO’s standard probabilistic model to form a pseudo-posterior used to select which points to evaluate next. We show that BOPrO is around 6.67 × faster than state-of-the-art methods on a common suite of benchmarks, and achieves a new state-of-the-art performance on a real-world hardware design application. We also show that BOPrO converges faster even if the priors for the optimum are not entirely accurate and that it robustly recovers from misleading priors.
AB - While Bayesian Optimization (BO) is a very popular method for optimizing expensive black-box functions, it fails to leverage the experience of domain experts. This causes BO to waste function evaluations on bad design choices (e.g., machine learning hyperparameters) that the expert already knows to work poorly. To address this issue, we introduce Bayesian Optimization with a Prior for the Optimum (BOPrO). BOPrO allows users to inject their knowledge into the optimization process in the form of priors about which parts of the input space will yield the best performance, rather than BO’s standard priors over functions, which are much less intuitive for users. BOPrO then combines these priors with BO’s standard probabilistic model to form a pseudo-posterior used to select which points to evaluate next. We show that BOPrO is around 6.67 × faster than state-of-the-art methods on a common suite of benchmarks, and achieves a new state-of-the-art performance on a real-world hardware design application. We also show that BOPrO converges faster even if the priors for the optimum are not entirely accurate and that it robustly recovers from misleading priors.
UR - http://www.scopus.com/inward/record.url?scp=85115712403&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-86523-8_17
DO - 10.1007/978-3-030-86523-8_17
M3 - Conference contribution
AN - SCOPUS:85115712403
SN - 9783030865221
VL - 3
T3 - Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)
SP - 265
EP - 296
BT - Machine Learning and Knowledge Discovery in Databases. Research Track
A2 - Oliver, Nuria
A2 - Pérez-Cruz, Fernando
A2 - Kramer, Stefan
A2 - Read, Jesse
A2 - Lozano, Jose A.
PB - Springer Nature Switzerland AG
CY - Cham
T2 - European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2021
Y2 - 13 September 2021 through 17 September 2021
ER -