LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Christoph Hube
  • Besnik Fetahu
  • Ujwal Gadiraju

Details

Original language: English
Title of host publication: SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018
Subtitle of host publication: Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper Proceedings of the 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (SAD 2018 and CrowdBias 2018), co-located with the 6th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2018)
Pages: 78-82
Number of pages: 5
Publication status: Published - 2018
Event: SAD+CrowdBias 2018: 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing (SAD 2018) and 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (CrowdBias 2018) - Zürich, Switzerland
Duration: 5 Jul 2018 - 5 Jul 2018

Publication series

Name: CEUR Workshop Proceedings
Publisher: CEUR Workshop Proceedings
Volume: 2276
ISSN (Print): 1613-0073

Abstract

Crowdsourcing results acquired for tasks that comprise a subjective component (e.g. opinion detection, sentiment analysis) are affected by the inherent bias of the crowd workers. This leads to weaker and noisier ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection, where we compare the workers' opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.
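
The abstract only sketches the approach: workers first state their own opinion on a topic, and those stated opinions are then compared with the labels they give in a bias-detection task. As a rough illustration of what such a comparison could look like (not the paper's actual formula; the field names, the -1/+1 stance encoding, and the score below are assumptions made for this example), one could check whether a worker flags statements opposing their own stance as "biased" more often than statements supporting it:

# Illustrative sketch only -- not the method published in the paper.
from collections import defaultdict
from statistics import mean

def worker_bias_scores(opinions, annotations):
    """Per-(worker, topic) bias estimate in [-1, 1].

    opinions:    dict mapping (worker, topic) -> stated stance, -1 (against) or +1 (in favour)
    annotations: iterable of dicts with keys 'worker', 'topic',
                 'statement_stance' (-1/+1, stance the annotated statement expresses) and
                 'flagged_biased' (True if the worker labelled the statement as biased)

    The score is the rate at which a worker flags statements opposing their own
    stance minus the rate at which they flag statements supporting it.
    """
    flags = defaultdict(list)  # (worker, topic, 'aligned'|'opposing') -> list of 0/1 flags
    for a in annotations:
        key = (a['worker'], a['topic'])
        if key not in opinions:
            continue  # no stated opinion to compare against
        side = 'aligned' if a['statement_stance'] == opinions[key] else 'opposing'
        flags[key + (side,)].append(1 if a['flagged_biased'] else 0)

    scores = {}
    for worker, topic in opinions:
        aligned = flags.get((worker, topic, 'aligned'), [])
        opposing = flags.get((worker, topic, 'opposing'), [])
        if aligned and opposing:  # need statements on both sides to say anything
            scores[(worker, topic)] = mean(opposing) - mean(aligned)
    return scores

# Toy usage with made-up data: the worker favours the topic and only flags the
# opposing statement, which yields the maximal score of 1.0.
opinions = {('w1', 'gun control'): +1}
annotations = [
    {'worker': 'w1', 'topic': 'gun control', 'statement_stance': +1, 'flagged_biased': False},
    {'worker': 'w1', 'topic': 'gun control', 'statement_stance': -1, 'flagged_biased': True},
]
print(worker_bias_scores(opinions, annotations))  # {('w1', 'gun control'): 1.0}

Under these assumptions, a score near 0 would suggest the worker's labels are independent of their own opinion, while scores near +1 would indicate a systematic skew in line with it.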

Cite this

Hube, C., Fetahu, B., & Gadiraju, U. (2018). LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments. In SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018: Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper Proceedings of the 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (SAD 2018 and CrowdBias 2018), co-located with the 6th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2018) (pp. 78-82). (CEUR Workshop Proceedings; Vol. 2276). https://ceur-ws.org/Vol-2276/paper9.pdf
@inproceedings{f1885ef3b8e74033a69c97620d67ceca,
title = "LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments",
abstract = "Crowdsourcing results acquired for tasks that comprise a subjective component (e.g. opinion detection, sentiment analysis) are affected by the inherent bias of the crowd workers. This leads to weaker and noisy ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection where we compare the worker{\textquoteright}s opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.",
author = "Christoph Hube and Besnik Fetahu and Ujwal Gadiraju",
note = "Funding information: Acknowledgments This work is funded by the ERC Advanced Grant ALEXANDRIA (grant no. 339233), DESIR (grant no. 31081), and H2020 AFEL project (grant no. 687916).; SAD+CrowdBias 2018 : 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper (SAD 2018) and 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (CrowdBias 2018) ; Conference date: 05-07-2018 Through 05-07-2018",
year = "2018",
language = "English",
series = "CEUR Workshop Proceedings",
publisher = "CEUR Workshop Proceedings",
pages = "78--82",
booktitle = "SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018",

}

TY - GEN
T1 - LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments
AU - Hube, Christoph
AU - Fetahu, Besnik
AU - Gadiraju, Ujwal
N1 - Funding information: This work is funded by the ERC Advanced Grant ALEXANDRIA (grant no. 339233), DESIR (grant no. 31081), and the H2020 AFEL project (grant no. 687916).
PY - 2018
Y1 - 2018
N2 - Crowdsourcing results acquired for tasks that comprise a subjective component (e.g. opinion detection, sentiment analysis) are affected by the inherent bias of the crowd workers. This leads to weaker and noisy ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection where we compare the worker’s opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.
AB - Crowdsourcing results acquired for tasks that comprise a subjective component (e.g. opinion detection, sentiment analysis) are affected by the inherent bias of the crowd workers. This leads to weaker and noisy ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection where we compare the worker’s opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.
UR - http://www.scopus.com/inward/record.url?scp=85058941737&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85058941737
T3 - CEUR Workshop Proceedings
SP - 78
EP - 82
BT - SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018
T2 - SAD+CrowdBias 2018
Y2 - 5 July 2018 through 5 July 2018
ER -