LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments

Publication: Contribution to book/report/collection/conference proceedings › Paper in conference proceedings › Research › Peer-reviewed

Authors

  • Christoph Hube
  • Besnik Fetahu
  • Ujwal Gadiraju

Organisational units


Details

Original language: English
Title of host publication: SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018
Subtitle: Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper Proceedings of the 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (SAD 2018 and CrowdBias 2018) co-located with the 6th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2018)
Pages: 78-82
Number of pages: 5
Publication status: Published - 2018
Event: SAD+CrowdBias 2018: 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper (SAD 2018) and 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (CrowdBias 2018) - Zürich, Switzerland
Duration: 5 July 2018 - 5 July 2018

Publication series

Name: CEUR Workshop Proceedings
Publisher: CEUR Workshop Proceedings
Volume: 2276
ISSN (Print): 1613-0073

Abstract

Crowdsourcing results acquired for tasks that comprise a subjective component (e.g., opinion detection, sentiment analysis) are affected by the inherent bias of the crowd workers. This leads to weak and noisy ground-truth data. In this work, we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection, where we compare workers' opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.
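The abstract sketches the approach only at a high level. As a purely illustrative aid (not the authors' actual method), the snippet below shows one plausible way to quantify a worker's bias: compare the worker's self-reported stance on a topic with how often they flag statements that oppose versus agree with that stance as biased. All names (Judgment, worker_bias) and the scoring rule are assumptions made for this sketch.

from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Judgment:
    worker_id: str
    topic: str              # hypothetical topic label, kept for per-topic grouping (unused here)
    stance: int             # worker's self-reported opinion: -1 (against), 0 (neutral), +1 (in favor)
    statement_leaning: int  # leaning of the annotated statement: -1, 0, or +1
    labeled_biased: bool    # whether the worker flagged the statement as biased

def worker_bias(judgments):
    """Per-worker score in [-1, 1]: positive values mean the worker flags statements
    opposing their own stance as biased more often than statements agreeing with it."""
    counts = defaultdict(lambda: {"agree": [0, 0], "oppose": [0, 0]})  # [flagged, total]
    for j in judgments:
        if j.stance == 0 or j.statement_leaning == 0:
            continue  # neutral cases carry no signal under this simple heuristic
        side = "agree" if j.stance == j.statement_leaning else "oppose"
        counts[j.worker_id][side][0] += int(j.labeled_biased)
        counts[j.worker_id][side][1] += 1
    rate = lambda c: c[0] / c[1] if c[1] else 0.0
    return {w: rate(c["oppose"]) - rate(c["agree"]) for w, c in counts.items()}

Under this heuristic, a worker who marks most statements contradicting their stance as biased but rarely flags agreeing ones scores close to +1, while an even-handed annotator scores near 0.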

ASJC Scopus subject areas

Cite this

LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments. / Hube, Christoph; Fetahu, Besnik; Gadiraju, Ujwal.
SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018: Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper Proceedings of the 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (SAD 2018 and CrowdBias 2018) co-located with the 6th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2018). 2018. pp. 78-82 (CEUR Workshop Proceedings; Vol. 2276).


Hube, C, Fetahu, B & Gadiraju, U 2018, LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments. in SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018: Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper Proceedings of the 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (SAD 2018 and CrowdBias 2018) co-located with the 6th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2018). CEUR Workshop Proceedings, Vol. 2276, pp. 78-82, SAD+CrowdBias 2018, Zürich, Switzerland, 5 July 2018. <https://ceur-ws.org/Vol-2276/paper9.pdf>
Hube, C., Fetahu, B., & Gadiraju, U. (2018). LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments. In SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018: Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper Proceedings of the 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (SAD 2018 and CrowdBias 2018) co-located with the 6th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2018) (pp. 78-82). (CEUR Workshop Proceedings; Vol. 2276). https://ceur-ws.org/Vol-2276/paper9.pdf
Hube C, Fetahu B, Gadiraju U. LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments. in SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018: Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper Proceedings of the 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (SAD 2018 and CrowdBias 2018) co-located with the 6th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2018). 2018. p. 78-82. (CEUR Workshop Proceedings).
Hube, Christoph ; Fetahu, Besnik ; Gadiraju, Ujwal. / LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments. SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018: Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper Proceedings of the 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (SAD 2018 and CrowdBias 2018) co-located with the 6th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2018). 2018. pp. 78-82 (CEUR Workshop Proceedings).
BibTeX
@inproceedings{f1885ef3b8e74033a69c97620d67ceca,
title = "LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments",
abstract = "Crowdsourcing results acquired for tasks that comprise a subjective component (e.g. opinion detection, sentiment analysis) are affected by the inherent bias of the crowd workers. This leads to weaker and noisy ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection where we compare the worker{\textquoteright}s opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.",
author = "Christoph Hube and Besnik Fetahu and Ujwal Gadiraju",
note = "Funding information: Acknowledgments This work is funded by the ERC Advanced Grant ALEXANDRIA (grant no. 339233), DESIR (grant no. 31081), and H2020 AFEL project (grant no. 687916).; SAD+CrowdBias 2018 : 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper (SAD 2018) and 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (CrowdBias 2018) ; Conference date: 05-07-2018 Through 05-07-2018",
year = "2018",
language = "English",
series = "CEUR Workshop Proceedings",
publisher = "CEUR Workshop Proceedings",
pages = "78--82",
booktitle = "SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018",

}

RIS

TY - GEN

T1 - LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments

AU - Hube, Christoph

AU - Fetahu, Besnik

AU - Gadiraju, Ujwal

N1 - Funding information: Acknowledgments This work is funded by the ERC Advanced Grant ALEXANDRIA (grant no. 339233), DESIR (grant no. 31081), and H2020 AFEL project (grant no. 687916).

PY - 2018

Y1 - 2018

N2 - Crowdsourcing results acquired for tasks that comprise a subjective component (e.g. opinion detection, sentiment analysis) are affected by the inherent bias of the crowd workers. This leads to weaker and noisy ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection where we compare the worker’s opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.

AB - Crowdsourcing results acquired for tasks that comprise a subjective component (e.g. opinion detection, sentiment analysis) are affected by the inherent bias of the crowd workers. This leads to weaker and noisy ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection where we compare the worker’s opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.

UR - http://www.scopus.com/inward/record.url?scp=85058941737&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:85058941737

T3 - CEUR Workshop Proceedings

SP - 78

EP - 82

BT - SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018

T2 - SAD+CrowdBias 2018

Y2 - 5 July 2018 through 5 July 2018

ER -