Crowd worker strategies in relevance judgment tasks

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

  • Lei Han
  • Eddy Maddalena
  • Alessandro Checco
  • Cristina Sarasua
  • Ujwal Gadiraju
  • Kevin Roitero
  • Gianluca Demartini

Organisational units

External organisations

  • University of Queensland
  • University of Southampton
  • The University of Sheffield
  • Universität Zürich (UZH)
  • University of Udine

Details

Original language: English
Title of host publication: WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining
Pages: 241-249
Number of pages: 9
ISBN (electronic): 9781450368223
Publication status: Published - 20 Jan 2020
Event: 13th ACM International Conference on Web Search and Data Mining, WSDM 2020 - Houston, United States
Duration: 3 Feb 2020 - 7 Feb 2020

Publication series

Name: WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining

Abstract

Crowdsourcing is a popular technique to collect large amounts of human-generated labels, such as the relevance judgments used to create information retrieval (IR) evaluation collections. Previous research has shown that collecting high-quality labels from a crowdsourcing platform can be challenging. Existing quality assurance techniques focus on answer aggregation or on the use of gold questions, where ground-truth data allows the quality of responses to be checked. In this paper, we present qualitative and quantitative results revealing how crowd workers adopt different work strategies to complete relevance judgment tasks efficiently, and the consequent impact on quality. We delve into the techniques and tools that highly experienced crowd workers use to be more efficient in completing crowdsourcing micro-tasks. To this end, we combine qualitative results from worker interviews and surveys with the results of a data-driven study of behavioral log data (i.e., clicks, keystrokes and keyboard shortcuts) collected from crowd workers performing relevance judgment tasks. Our results highlight the presence of frequently used shortcut patterns that can speed up task completion, thus increasing the hourly wage of efficient workers. We observe how crowd work experience results in different types of working strategies, productivity levels, and quality and diversity of the crowdsourced judgments.
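
To make the quantitative part of the abstract concrete, below is a minimal sketch of how frequently used keyboard-shortcut patterns could be mined from behavioral keystroke logs. It is an illustration only: the log schema (worker_id, timestamp, key), the example events, and the 0.5-second gap threshold are assumptions made for this sketch, not the actual data format or analysis pipeline used in the paper.

from collections import Counter
from itertools import groupby

# Hypothetical keystroke log: (worker_id, timestamp_in_seconds, key).
# Schema and values are illustrative assumptions, not the paper's dataset.
log = [
    ("w1", 10.0, "Ctrl"), ("w1", 10.1, "c"),
    ("w1", 12.0, "Ctrl"), ("w1", 12.1, "v"),
    ("w1", 15.0, "Tab"),  ("w1", 15.2, "Enter"),
    ("w2", 11.0, "Ctrl"), ("w2", 11.3, "f"),
]

def shortcut_bigrams(events, max_gap=0.5):
    """Count consecutive key pairs pressed within max_gap seconds of each
    other, a rough proxy for deliberate shortcut use (e.g. Ctrl+C, Ctrl+V)."""
    counts = Counter()
    for (_, t1, k1), (_, t2, k2) in zip(events, events[1:]):
        if t2 - t1 <= max_gap:
            counts[(k1, k2)] += 1
    return counts

# Group the log by worker and report each worker's most frequent key pairs.
for worker, events in groupby(sorted(log), key=lambda e: e[0]):
    print(worker, shortcut_bigrams(list(events)).most_common(3))

On the toy log this reports Ctrl+C/Ctrl+V-style pairs for w1 and Ctrl+F for w2; a real analysis in the spirit of the paper would relate such pattern counts to task completion time and judgment quality across many workers.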

ASJC Scopus subject areas

Cite this

Crowd worker strategies in relevance judgment tasks. / Han, Lei; Maddalena, Eddy; Checco, Alessandro et al.
WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining. 2020. pp. 241-249 (WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining).


Han, L, Maddalena, E, Checco, A, Sarasua, C, Gadiraju, U, Roitero, K & Demartini, G 2020, Crowd worker strategies in relevance judgment tasks. in WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining. WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining, pp. 241-249, 13th ACM International Conference on Web Search and Data Mining, WSDM 2020, Houston, United States, 3 Feb 2020. https://doi.org/10.1145/3336191.3371857
Han, L., Maddalena, E., Checco, A., Sarasua, C., Gadiraju, U., Roitero, K., & Demartini, G. (2020). Crowd worker strategies in relevance judgment tasks. In WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining (pp. 241-249). (WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining). https://doi.org/10.1145/3336191.3371857
Han L, Maddalena E, Checco A, Sarasua C, Gadiraju U, Roitero K et al. Crowd worker strategies in relevance judgment tasks. in WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining. 2020. p. 241-249. (WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining). doi: 10.1145/3336191.3371857
Han, Lei ; Maddalena, Eddy ; Checco, Alessandro et al. / Crowd worker strategies in relevance judgment tasks. WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining. 2020. pp. 241-249 (WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining).
@inproceedings{cd147c4f2da447c49f60e1a44afc9651,
title = "Crowd worker strategies in relevance judgment tasks",
abstract = "Crowdsourcing is a popular technique to collect large amounts of human-generated labels, such as relevance judgments used to create information retrieval (IR) evaluation collections. Previous research has shown how collecting high quality labels from a crowdsourcing platform can be challenging. Existing quality assurance techniques focus on answer aggregation or on the use of gold questions where ground-truth data allows to check for the quality of the responses. In this paper, we present qualitative and quantitative results, revealing how different crowd workers adopt different work strategies to complete relevance judgment tasks efficiently and their consequent impact on quality. We delve into the techniques and tools that highly experienced crowd workers use to be more efficient in completing crowdsourcing micro-tasks. To this end, we use both qualitative results from worker interviews and surveys, as well as the results of a data-driven study of behavioral log data (i.e., clicks, keystrokes and keyboard shortcuts) collected from crowd workers performing relevance judgment tasks. Our results highlight the presence of frequently used shortcut patterns that can speed-up task completion, thus increasing the hourly wage of efficient workers. We observe how crowd work experiences result in different types of working strategies, productivity levels, quality and diversity of the crowdsourced judgments.",
keywords = "Crowdsourcing, IR evaluation, Relevance judgment, User behavior",
author = "Lei Han and Eddy Maddalena and Alessandro Checco and Cristina Sarasua and Ujwal Gadiraju and Kevin Roitero and Gianluca Demartini",
note = "Funding Information: Acknowledgements. This work is supported by ARC Discovery Project (DP190102141) and the Erasmus+ project DISKOW (60171990).; 13th ACM International Conference on Web Search and Data Mining, WSDM 2020 ; Conference date: 03-02-2020 Through 07-02-2020",
year = "2020",
month = jan,
day = "20",
doi = "10.1145/3336191.3371857",
language = "English",
series = "WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining",
pages = "241--249",
booktitle = "WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining",

}


TY - GEN

T1 - Crowd worker strategies in relevance judgment tasks

AU - Han, Lei

AU - Maddalena, Eddy

AU - Checco, Alessandro

AU - Sarasua, Cristina

AU - Gadiraju, Ujwal

AU - Roitero, Kevin

AU - Demartini, Gianluca

N1 - Funding Information: Acknowledgements. This work is supported by ARC Discovery Project (DP190102141) and the Erasmus+ project DISKOW (60171990).

PY - 2020/1/20

Y1 - 2020/1/20

N2 - Crowdsourcing is a popular technique to collect large amounts of human-generated labels, such as relevance judgments used to create information retrieval (IR) evaluation collections. Previous research has shown how collecting high quality labels from a crowdsourcing platform can be challenging. Existing quality assurance techniques focus on answer aggregation or on the use of gold questions where ground-truth data allows to check for the quality of the responses. In this paper, we present qualitative and quantitative results, revealing how different crowd workers adopt different work strategies to complete relevance judgment tasks efficiently and their consequent impact on quality. We delve into the techniques and tools that highly experienced crowd workers use to be more efficient in completing crowdsourcing micro-tasks. To this end, we use both qualitative results from worker interviews and surveys, as well as the results of a data-driven study of behavioral log data (i.e., clicks, keystrokes and keyboard shortcuts) collected from crowd workers performing relevance judgment tasks. Our results highlight the presence of frequently used shortcut patterns that can speed-up task completion, thus increasing the hourly wage of efficient workers. We observe how crowd work experiences result in different types of working strategies, productivity levels, quality and diversity of the crowdsourced judgments.

AB - Crowdsourcing is a popular technique to collect large amounts of human-generated labels, such as relevance judgments used to create information retrieval (IR) evaluation collections. Previous research has shown how collecting high quality labels from a crowdsourcing platform can be challenging. Existing quality assurance techniques focus on answer aggregation or on the use of gold questions where ground-truth data allows to check for the quality of the responses. In this paper, we present qualitative and quantitative results, revealing how different crowd workers adopt different work strategies to complete relevance judgment tasks efficiently and their consequent impact on quality. We delve into the techniques and tools that highly experienced crowd workers use to be more efficient in completing crowdsourcing micro-tasks. To this end, we use both qualitative results from worker interviews and surveys, as well as the results of a data-driven study of behavioral log data (i.e., clicks, keystrokes and keyboard shortcuts) collected from crowd workers performing relevance judgment tasks. Our results highlight the presence of frequently used shortcut patterns that can speed-up task completion, thus increasing the hourly wage of efficient workers. We observe how crowd work experiences result in different types of working strategies, productivity levels, quality and diversity of the crowdsourced judgments.

KW - Crowdsourcing

KW - IR evaluation

KW - Relevance judgment

KW - User behavior

UR - http://www.scopus.com/inward/record.url?scp=85079523647&partnerID=8YFLogxK

U2 - 10.1145/3336191.3371857

DO - 10.1145/3336191.3371857

M3 - Conference contribution

AN - SCOPUS:85079523647

T3 - WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining

SP - 241

EP - 249

BT - WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining

T2 - 13th ACM International Conference on Web Search and Data Mining, WSDM 2020

Y2 - 3 February 2020 through 7 February 2020

ER -