Details
| Original language | English |
| --- | --- |
| Title of host publication | SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018 |
| Subtitle of host publication | Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper Proceedings of the 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (SAD 2018 and CrowdBias 2018), co-located with the 6th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2018) |
| Pages | 78-82 |
| Number of pages | 5 |
| Publication status | Published - 2018 |
| Event | SAD+CrowdBias 2018: 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing (SAD 2018) and 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (CrowdBias 2018) - Zürich, Switzerland. Duration: 5 Jul 2018 → 5 Jul 2018 |
Publication series
| Name | CEUR Workshop Proceedings |
| --- | --- |
| Publisher | CEUR Workshop Proceedings |
| Volume | 2276 |
| ISSN (Print) | 1613-0073 |
Abstract
Crowdsourcing results acquired for tasks that comprise a subjective component (e.g., opinion detection, sentiment analysis) are affected by the inherent biases of the crowd workers. This leads to weak and noisy ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection, where we compare workers’ opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.
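The comparison the abstract describes, a worker’s stated opinions versus the labels they assign on the same topics, can be illustrated with a small script. The paper does not include an implementation here, so the following is a minimal, hypothetical sketch: the stance scale, field names, and scoring rule are assumptions for illustration, not the authors’ actual method.

```python
from collections import defaultdict

def worker_bias_scores(opinions, annotations):
    """Score each worker by how strongly their 'biased' labels track
    their own opinions.

    opinions:    {(worker_id, topic): stance}, stance in [-1, 1]
                 (-1 = strongly against the topic, +1 = strongly in favor).
    annotations: list of dicts with keys worker_id, topic,
                 statement_stance (in [-1, 1]) and label
                 (1 = worker marked the statement "biased", 0 = "neutral").

    Returns {worker_id: score} with score in [-1, 1]; a positive score
    means the worker mostly flags statements *opposing* their own
    stance as biased.
    """
    totals = defaultdict(float)
    counts = defaultdict(int)
    for a in annotations:
        stance = opinions.get((a["worker_id"], a["topic"]))
        if stance is None or a["label"] != 1:
            continue  # skip unknown topics and "neutral" judgments
        # The statement opposes the worker's opinion when the signs differ.
        opposing = stance * a["statement_stance"] < 0
        totals[a["worker_id"]] += 1.0 if opposing else -1.0
        counts[a["worker_id"]] += 1
    return {w: totals[w] / counts[w] for w in counts}

# Toy example: worker w1 favors the topic (+0.8) and calls an opposing
# statement biased, so their score is +1.0 under this scheme.
opinions = {("w1", "topic_a"): 0.8}
annotations = [
    {"worker_id": "w1", "topic": "topic_a", "statement_stance": -0.6, "label": 1},
    {"worker_id": "w1", "topic": "topic_a", "statement_stance": 0.7, "label": 0},
]
print(worker_bias_scores(opinions, annotations))  # {'w1': 1.0}
```

Under these assumptions, a score near +1 or -1 would suggest that a worker’s own stance systematically leaks into their annotations, while a score near 0 would suggest stance-independent labeling; the actual measure used in the paper may differ.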
ASJC Scopus subject areas
- General Computer Science
Cite this
Hube, C., Fetahu, B., & Gadiraju, U. (2018). LimitBias! Measuring worker biases in the crowdsourced collection of subjective judgments. In SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018: Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper Proceedings of the 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (SAD 2018 and CrowdBias 2018), co-located with the 6th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2018) (pp. 78-82). (CEUR Workshop Proceedings; Vol. 2276).
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review
TY - GEN
T1 - LimitBias! Measuring worker biases in the crowdsourced collection of subjective judgments
AU - Hube, Christoph
AU - Fetahu, Besnik
AU - Gadiraju, Ujwal
N1 - Funding information: This work is funded by the ERC Advanced Grant ALEXANDRIA (grant no. 339233), DESIR (grant no. 31081), and the H2020 AFEL project (grant no. 687916).
PY - 2018
Y1 - 2018
N2 - Crowdsourcing results acquired for tasks that comprise a subjective component (e.g., opinion detection, sentiment analysis) are affected by the inherent biases of the crowd workers. This leads to weak and noisy ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection, where we compare workers’ opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.
AB - Crowdsourcing results acquired for tasks that comprise a subjective component (e.g., opinion detection, sentiment analysis) are affected by the inherent biases of the crowd workers. This leads to weak and noisy ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection, where we compare workers’ opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.
UR - http://www.scopus.com/inward/record.url?scp=85058941737&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85058941737
T3 - CEUR Workshop Proceedings
SP - 78
EP - 82
BT - SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018
T2 - SAD+CrowdBias 2018
Y2 - 5 July 2018 through 5 July 2018
ER -