“It might be this, it should be that…” uncertainty and doubt in day-to-day research practice

Research output: Contribution to journal › Article › Research › peer review

Authors

  • Jutta Schickore
  • Nora Hangel

External Research Organisations

  • Indiana University Bloomington
  • University of Konstanz

Details

Original language: English
Article number: 31
Journal: European Journal for Philosophy of Science
Volume: 9
Issue number: 2
Publication status: Published - 26 Mar 2019
Externally published: Yes

Abstract

This paper examines how scientists conceptualize their research methodologies. Do scientists raise concerns about vague criteria and genuine uncertainties in experimental practice? If so, what sorts of issues do they identify as problematic? Do scientists acknowledge the presence of value judgments in scientific research, and do they reflect on the relation between epistemic and non-epistemic criteria for decision-making? We present findings from an analysis of qualitative interviews with 63 scientific researchers who talk about their views on good research practice. We argue that analysts of science should care about scientists’ conceptualizations of the criteria and of the practical judgments that scientific inquiry involves. While scientists’ accounts of their own research methodologies alone do not give us a full picture of how science really works, they can point us to areas of concern. They can inspire and direct philosophical reflections about how science works. Throughout the interviews, the participating researchers provided specific examples from their own research contexts as illustrations of their methodological points. These examples reveal that scientists often struggle to evaluate the quality of their data, to figure out whether the available evidence confirms their hypothesis, whether a replication was successful, or to what extent they can rely on peer-reviewed papers. General ideas about good research methods do not directly translate into specific evaluation criteria or strategies that can guide research and help validate empirical data.

Keywords

    Empirical science studies, Scientists’ methodologies, Uncertainty


Cite this

“It might be this, it should be that…” uncertainty and doubt in day-to-day research practice. / Schickore, Jutta; Hangel, Nora.
In: European Journal for Philosophy of Science, Vol. 9, No. 2, 31, 26.03.2019.

Research output: Contribution to journal › Article › Research › peer review

@article{9486f56adfa449b3a6d80a6ec5de8021,
title = "“It might be this, it should be that…” uncertainty and doubt in day-to-day research practice",
abstract = "This paper examines how scientists conceptualize their research methodologies. Do scientists raise concerns about vague criteria and genuine uncertainties in experimental practice? If so, what sorts of issues do they identify as problematic? Do scientists acknowledge the presence of value judgments in scientific research, and do they reflect on the relation between epistemic and non-epistemic criteria for decision-making? We present findings from an analysis of qualitative interviews with 63 scientific researchers who talk about their views on good research practice. We argue that analysts of science should care about scientists{\textquoteright} conceptualizations of the criteria and of the practical judgments that scientific inquiry involves. While scientists{\textquoteright} accounts of their own research methodologies alone do not give us a full picture of how science really works, they can point us to areas of concern. They can inspire and direct philosophical reflections about how science works. Throughout the interviews, the participating researchers provided specific examples from their own research contexts as illustrations of their methodological points. These examples reveal that scientists often struggle to evaluate the quality of their data, to figure out whether the available evidence confirms their hypothesis, whether a replication was successful, or to what extent they can rely on peer-reviewed papers. General ideas about good research methods do not directly translate into specific evaluation criteria or strategies that can guide research and help validate empirical data.",
keywords = "Empirical science studies, Scientists{\textquoteright} methodologies, Uncertainty",
author = "Jutta Schickore and Nora Hangel",
note = "Funding Information: Acknowledgments This study is part of a project funded by the NSF (Grant # SES-1534628, PI Jutta Schickore). Schickore produced the account presented in this paper while she was a member of the Institute for Advanced Study (Princeton) and gratefully acknowledges funding through the IAS. Early versions of this paper were presented at the VMST conference in Dallas (spring 2017), the LEAHPS workshop in Pittsburgh (spring 2018), and the SPSP conference in Ghent (summer 2018). We thank the audiences as well as Mike O{\textquoteright}Rourke and the anonymous reviewers for European Journal for Philosophy of Science for helpful comments and suggestions. Christoph Hoffmann and Klodian Coko also offered valuable feedback. We thank our research assistants Alvaro Michael and Catharine Xu for help with the re-organization and coding of the interviews. We acknowledge Diana Schmidt-Pfister, who developed the original project, {\textquotedblleft}Scientific Integrity in the context of Integration and Competition{\textquotedblright} funded by the German Research Foundation (DFG 751/11, 751/11 and 472/16) and subsequently collaborated with Hangel at the Center of Excellence 16 {\textquotedblleft}Cultural Foundations of Integration{\textquotedblright} at the University of Konstanz. We are also grateful to research assistants at Konstanz who transcribed and helped code the interviews. ",
year = "2019",
month = mar,
day = "26",
doi = "10.1007/s13194-019-0253-9",
language = "English",
volume = "9",
journal = "European Journal for Philosophy of Science",
issn = "1879-4912",
publisher = "Springer Netherlands",
number = "2",

}


TY - JOUR

T1 - “It might be this, it should be that…” uncertainty and doubt in day-to-day research practice

AU - Schickore, Jutta

AU - Hangel, Nora

N1 - Funding Information: Acknowledgments This study is part of a project funded by the NSF (Grant # SES-1534628, PI Jutta Schickore). Schickore produced the account presented in this paper while she was a member of the Institute for Advanced Study (Princeton) and gratefully acknowledges funding through the IAS. Early versions of this paper were presented at the VMST conference in Dallas (spring 2017), the LEAHPS workshop in Pittsburgh (spring 2018), and the SPSP conference in Ghent (summer 2018). We thank the audiences as well as Mike O’Rourke and the anonymous reviewers for European Journal for Philosophy of Science for helpful comments and suggestions. Christoph Hoffmann and Klodian Coko also offered valuable feedback. We thank our research assistants Alvaro Michael and Catharine Xu for help with the re-organization and coding of the interviews. We acknowledge Diana Schmidt-Pfister, who developed the original project, “Scientific Integrity in the context of Integration and Competition” funded by the German Research Foundation (DFG 751/11, 751/11 and 472/16) and subsequently collaborated with Hangel at the Center of Excellence 16 “Cultural Foundations of Integration” at the University of Konstanz. We are also grateful to research assistants at Konstanz who transcribed and helped code the interviews.

PY - 2019/3/26

Y1 - 2019/3/26

N2 - This paper examines how scientists conceptualize their research methodologies. Do scientists raise concerns about vague criteria and genuine uncertainties in experimental practice? If so, what sorts of issues do they identify as problematic? Do scientists acknowledge the presence of value judgments in scientific research, and do they reflect on the relation between epistemic and non-epistemic criteria for decision-making? We present findings from an analysis of qualitative interviews with 63 scientific researchers who talk about their views on good research practice. We argue that analysts of science should care about scientists’ conceptualizations of the criteria and of the practical judgments that scientific inquiry involves. While scientists’ accounts of their own research methodologies alone do not give us a full picture of how science really works, they can point us to areas of concern. They can inspire and direct philosophical reflections about how science works. Throughout the interviews, the participating researchers provided specific examples from their own research contexts as illustrations of their methodological points. These examples reveal that scientists often struggle to evaluate the quality of their data, to figure out whether the available evidence confirms their hypothesis, whether a replication was successful, or to what extent they can rely on peer-reviewed papers. General ideas about good research methods do not directly translate into specific evaluation criteria or strategies that can guide research and help validate empirical data.

AB - This paper examines how scientists conceptualize their research methodologies. Do scientists raise concerns about vague criteria and genuine uncertainties in experimental practice? If so, what sorts of issues do they identify as problematic? Do scientists acknowledge the presence of value judgments in scientific research, and do they reflect on the relation between epistemic and non-epistemic criteria for decision-making? We present findings from an analysis of qualitative interviews with 63 scientific researchers who talk about their views on good research practice. We argue that analysts of science should care about scientists’ conceptualizations of the criteria and of the practical judgments that scientific inquiry involves. While scientists’ accounts of their own research methodologies alone do not give us a full picture of how science really works, they can point us to areas of concern. They can inspire and direct philosophical reflections about how science works. Throughout the interviews, the participating researchers provided specific examples from their own research contexts as illustrations of their methodological points. These examples reveal that scientists often struggle to evaluate the quality of their data, to figure out whether the available evidence confirms their hypothesis, whether a replication was successful, or to what extent they can rely on peer-reviewed papers. General ideas about good research methods do not directly translate into specific evaluation criteria or strategies that can guide research and help validate empirical data.

KW - Empirical science studies

KW - Scientists’ methodologies

KW - Uncertainty

UR - http://www.scopus.com/inward/record.url?scp=85063591700&partnerID=8YFLogxK

U2 - 10.1007/s13194-019-0253-9

DO - 10.1007/s13194-019-0253-9

M3 - Article

AN - SCOPUS:85063591700

VL - 9

JO - European Journal for Philosophy of Science

JF - European Journal for Philosophy of Science

SN - 1879-4912

IS - 2

M1 - 31

ER -