Details
Original language | English |
---|---|
Number of pages | 18 |
Journal | Journal of Machine Learning Research |
Volume | 21 |
Publication status | Published - Nov. 2020 |
Abstract
Finding a well-performing architecture is often tedious for both deep learning practitioners and researchers, leading to tremendous interest in the automation of this task by means of neural architecture search (NAS). Although the community has made major strides in developing better NAS methods, the quality of scientific empirical evaluations in the young field of NAS is still lagging behind that of other areas of machine learning. To address this issue, we describe a set of possible issues and ways to avoid them, leading to the NAS best practices checklist available at http://automl.org/nas_checklist.pdf.
ASJC Scopus subject areas
- Computer Science (all)
- Software
- Computer Science (all)
- Artificial intelligence
- Engineering (all)
- Control and Systems Engineering
- Mathematics (all)
- Statistics and Probability
Cite
Best Practices for Scientific Research on Neural Architecture Search. / Lindauer, Marius; Hutter, Frank.
in: Journal of Machine Learning Research, Volume 21, 11.2020.
Publication: Contribution to journal › Article › Research › Peer review
TY - JOUR
T1 - Best Practices for Scientific Research on Neural Architecture Search
AU - Lindauer, Marius
AU - Hutter, Frank
PY - 2020/11
Y1 - 2020/11
N2 - Finding a well-performing architecture is often tedious for both deep learning practitioners and researchers, leading to tremendous interest in the automation of this task by means of neural architecture search (NAS). Although the community has made major strides in developing better NAS methods, the quality of scientific empirical evaluations in the young field of NAS is still lagging behind that of other areas of machine learning. To address this issue, we describe a set of possible issues and ways to avoid them, leading to the NAS best practices checklist available at http://automl.org/nas_checklist.pdf.
AB - Finding a well-performing architecture is often tedious for both deep learning practitioners and researchers, leading to tremendous interest in the automation of this task by means of neural architecture search (NAS). Although the community has made major strides in developing better NAS methods, the quality of scientific empirical evaluations in the young field of NAS is still lagging behind that of other areas of machine learning. To address this issue, we describe a set of possible issues and ways to avoid them, leading to the NAS best practices checklist available at http://automl.org/nas_checklist.pdf.
KW - cs.LG
KW - stat.ML
KW - Neural Architecture Search
KW - Scientific Best Practices
KW - Empirical Evaluation
UR - http://www.scopus.com/inward/record.url?scp=85098463247&partnerID=8YFLogxK
M3 - Article
VL - 21
JO - Journal of Machine Learning Research
JF - Journal of Machine Learning Research
SN - 1532-4435
ER -