The dataref versuchung: Saving Time through Better Internal Repeatability

Research output: Contribution to journal › Conference article › Research › peer review

Authors

  • Christian Dietrich
  • Daniel Lohmann

External Research Organisations

  • Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU Erlangen-Nürnberg)

Details

Original language: English
Pages (from-to): 51-60
Number of pages: 10
Journal: Operating Systems Review (ACM)
Volume: 49
Issue number: 1
Publication status: Published - 20 Jan 2015
Externally published: Yes
Event: 8th Workshop on Large-Scale Distributed Systems and Middleware, LADIS 2014 - Cambridge, United Kingdom
Duration: 23 Oct 2014 – 24 Oct 2014

Abstract

Compared to more traditional disciplines, such as the natural sciences, computer science is said to have a somewhat sloppy relationship with the external repeatability of published results. However, from our experience the problem starts even earlier: In many cases, authors are not even able to replicate their own results a year later, or to explain how exactly that number on page three of the paper was computed. Because of constant time pressure and strict submission deadlines, the successful researcher has to favor timely results over experiment documentation and data traceability. We consider internal repeatability to be one of the most important prerequisites for external replicability and the scientific process. We describe our approach to foster internal repeatability in our own research projects with the help of dedicated tools for the automation of traceable experimental setups and for data presentation in scientific papers. By employing these tools, measures for ensuring internal repeatability no longer waste valuable working time and pay off quickly: They save time by eliminating recurring, and therefore error-prone, manual work steps, and at the same time increase confidence in experimental results.
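The abstract refers to two in-house tools: versuchung, for scripting traceable experimental setups, and dataref, for pulling measured numbers into the paper itself. As a rough illustration of the first idea, the sketch below shows what a self-documenting experiment script might look like: it records its parameters, code revision, and host environment alongside the results, so the run can be reproduced and every published number traced back to its origin. This is a minimal sketch only, not the actual versuchung API; all names here (run_experiment, the JSON record layout, the stand-in workload) are illustrative assumptions.

# Minimal sketch of a traceable, repeatable experiment script.
# Illustrates the idea from the abstract; NOT the versuchung API.
import hashlib
import json
import platform
import subprocess
import time

def run_experiment(params):
    """Run one measurement and return its raw result (seconds)."""
    start = time.perf_counter()
    sum(i * i for i in range(params["n"]))  # stand-in workload
    return time.perf_counter() - start

def main():
    params = {"n": 1_000_000, "repetitions": 5}
    # Record everything needed to re-run this experiment later:
    # parameters, code revision, and the machine it ran on.
    # (Assumes the script lives in a git checkout.)
    metadata = {
        "params": params,
        "hostname": platform.node(),
        "python": platform.python_version(),
        "git_revision": subprocess.run(
            ["git", "rev-parse", "HEAD"],
            capture_output=True, text=True).stdout.strip(),
    }
    results = [run_experiment(params) for _ in range(params["repetitions"])]
    record = {"metadata": metadata,
              "results": results,
              "mean_seconds": sum(results) / len(results)}
    # A content-derived file name ties the output to this exact setup,
    # so stale results cannot silently masquerade as current ones.
    tag = hashlib.sha1(json.dumps(metadata, sort_keys=True)
                       .encode()).hexdigest()[:10]
    with open(f"experiment-{tag}.json", "w") as f:
        json.dump(record, f, indent=2)

if __name__ == "__main__":
    main()

The complementary half of the approach is on the presentation side: instead of pasting the resulting mean into the paper by hand, each number is referenced from the generated result file at build time, which is the role the authors' dataref LaTeX package plays.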


Cite this

The dataref versuchung: Saving Time through Better Internal Repeatability. / Dietrich, Christian; Lohmann, Daniel.
In: Operating Systems Review (ACM), Vol. 49, No. 1, 20.01.2015, p. 51-60.


Dietrich, C & Lohmann, D 2015, 'The dataref versuchung: Saving Time through Better Internal Repeatability', Operating Systems Review (ACM), vol. 49, no. 1, pp. 51-60. https://doi.org/10.1145/2723872.2723880
Dietrich, C., & Lohmann, D. (2015). The dataref versuchung: Saving Time through Better Internal Repeatability. Operating Systems Review (ACM), 49(1), 51-60. https://doi.org/10.1145/2723872.2723880
Dietrich C, Lohmann D. The dataref versuchung: Saving Time through Better Internal Repeatability. Operating Systems Review (ACM). 2015 Jan 20;49(1):51-60. doi: 10.1145/2723872.2723880
Dietrich, Christian; Lohmann, Daniel. / The dataref versuchung: Saving Time through Better Internal Repeatability. In: Operating Systems Review (ACM). 2015; Vol. 49, No. 1, pp. 51-60.
@article{ed632c8080a54117b32d9bbe667b5e90,
title = "The dataref versuchung: Saving Time through Better Internal Repeatability",
abstract = "Compared to more traditional disciplines, such as the natural sciences, computer science is said to have a somewhat sloppy relationship with the external repeatability of published results. However, from our experience the problem starts even earlier: In many cases, authors are not even able to replicate their own results a year later, or to explain how exactly that number on page three of the paper was computed. Because of constant time pressure and strict submission deadlines, the successful researcher has to favor timely results over experiment documentation and data traceability. We consider internal repeatability to be one of the most important prerequisites for external replicability and the scientific process. We describe our approach to foster internal repeatability in our own research projects with the help of dedicated tools for the automation of traceable experimental setups and for data presentation in scientific papers. By employing these tools, measures for ensuring internal repeatability no longer waste valuable working time and pay off quickly: They save time by eliminating recurring, and therefore error-prone, manual work steps, and at the same time increase confidence in experimental results.",
author = "Christian Dietrich and Daniel Lohmann",
year = "2015",
month = jan,
day = "20",
doi = "10.1145/2723872.2723880",
language = "English",
volume = "49",
pages = "51--60",
number = "1",
journal = "Operating Systems Review (ACM)",
issn = "0163-5980",
note = "8th Workshop on Large-Scale Distributed Systems and Middleware, LADIS 2014 ; Conference date: 23-10-2014 Through 24-10-2014",

}


TY - JOUR

T1 - The dataref versuchung: Saving Time through Better Internal Repeatability

T2 - 8th Workshop on Large-Scale Distributed Systems and Middleware, LADIS 2014

AU - Dietrich, Christian

AU - Lohmann, Daniel

PY - 2015/1/20

Y1 - 2015/1/20

N2 - Compared to more traditional disciplines, such as the natural sciences, computer science is said to have a somewhat sloppy relationship with the external repeatability of published results. However, from our experience the problem starts even earlier: In many cases, authors are not even able to replicate their own results a year later, or to explain how exactly that number on page three of the paper was computed. Because of constant time pressure and strict submission deadlines, the successful researcher has to favor timely results over experiment documentation and data traceability. We consider internal repeatability to be one of the most important prerequisites for external replicability and the scientific process. We describe our approach to foster internal repeatability in our own research projects with the help of dedicated tools for the automation of traceable experimental setups and for data presentation in scientific papers. By employing these tools, measures for ensuring internal repeatability no longer waste valuable working time and pay off quickly: They save time by eliminating recurring, and therefore error-prone, manual work steps, and at the same time increase confidence in experimental results.

AB - Compared to more traditional disciplines, such as the natural sciences, computer science is said to have a somewhat sloppy relationship with the external repeatability of published results. However, from our experience the problem starts even earlier: In many cases, authors are not even able to replicate their own results a year later, or to explain how exactly that number on page three of the paper was computed. Because of constant time pressure and strict submission deadlines, the successful researcher has to favor timely results over experiment documentation and data traceability. We consider internal repeatability to be one of the most important prerequisites for external replicability and the scientific process. We describe our approach to foster internal repeatability in our own research projects with the help of dedicated tools for the automation of traceable experimental setups and for data presentation in scientific papers. By employing these tools, measures for ensuring internal repeatability no longer waste valuable working time and pay off quickly: They save time by eliminating recurring, and therefore error-prone, manual work steps, and at the same time increase confidence in experimental results.

UR - http://www.scopus.com/inward/record.url?scp=84955300775&partnerID=8YFLogxK

U2 - 10.1145/2723872.2723880

DO - 10.1145/2723872.2723880

M3 - Conference article

AN - SCOPUS:84955300775

VL - 49

SP - 51

EP - 60

JO - Operating Systems Review (ACM)

JF - Operating Systems Review (ACM)

SN - 0163-5980

IS - 1

Y2 - 23 October 2014 through 24 October 2014

ER -