Details
Original language | English |
---|---|
Title of host publication | SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018 |
Subtitle | Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper Proceedings of the 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (SAD 2018 and CrowdBias 2018), co-located with the 6th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2018) |
Pages | 78-82 |
Number of pages | 5 |
Publication status | Published - 2018 |
Event | SAD+CrowdBias 2018: 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing (SAD 2018) and 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (CrowdBias 2018) - Zurich, Switzerland. Duration: 5 July 2018 → 5 July 2018 |
Publication series
Name | CEUR Workshop Proceedings |
---|---|
Publisher | CEUR Workshop Proceedings |
Volume | 2276 |
ISSN (Print) | 1613-0073 |
Abstract
Crowdsourcing results acquired for tasks that comprise a subjective component (e.g., opinion detection, sentiment analysis) are affected by the inherent bias of the crowd workers. This leads to noisier, less reliable ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection, where we compare workers' opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.
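The abstract describes comparing a worker's own opinions with the annotations they produce. As a minimal sketch of that idea (not the paper's actual metric — the function, data, and scoring rule below are all hypothetical), one could measure how much more readily a worker labels statements opposing their stated stance as "biased" than statements aligned with it:

```python
# Hypothetical worker-bias score: the gap between a worker's rate of flagging
# opposing-stance statements as "biased" and their rate for aligned statements.
# Illustrative only; this is not the metric proposed in the paper.

def worker_bias_score(opinion, annotations):
    """opinion: the worker's stated stance on a topic ("pro" or "con").
    annotations: list of (statement_stance, labeled_biased) pairs, where
    statement_stance is "pro"/"con" and labeled_biased is True/False."""
    opposing = [biased for stance, biased in annotations if stance != opinion]
    aligned = [biased for stance, biased in annotations if stance == opinion]
    if not opposing or not aligned:
        return 0.0  # not enough evidence on one side to compare
    rate_opposing = sum(opposing) / len(opposing)
    rate_aligned = sum(aligned) / len(aligned)
    # Positive score: the worker flags opposing views as biased more readily.
    return rate_opposing - rate_aligned

# Example: a "pro" worker flags 3 of 4 opposing statements as biased,
# but only 1 of 4 aligned statements.
ann = [("con", True), ("con", True), ("con", True), ("con", False),
       ("pro", False), ("pro", True), ("pro", False), ("pro", False)]
print(worker_bias_score("pro", ann))  # 0.75 - 0.25 = 0.5
```

A score near zero would suggest the worker's annotations are independent of their own stance; larger magnitudes would flag workers whose labels track their opinions.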
ASJC Scopus subject areas
- Computer Science (all)
- General Computer Science
Cite this
SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018: Proceedings of the 1st Workshop on Subjectivity, Ambiguity and Disagreement in Crowdsourcing, and Short Paper Proceedings of the 1st Workshop on Disentangling the Relation Between Crowdsourcing and Bias Management (SAD 2018 and CrowdBias 2018), co-located with the 6th AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2018). 2018. pp. 78-82 (CEUR Workshop Proceedings; Vol. 2276).
Publication: Contribution to book/report/collection/conference proceedings › Conference paper › Research › Peer-reviewed
TY - GEN
T1 - LimitBias! measuring worker biases in the crowdsourced collection of subjective judgments
AU - Hube, Christoph
AU - Fetahu, Besnik
AU - Gadiraju, Ujwal
N1 - Funding information: Acknowledgments This work is funded by the ERC Advanced Grant ALEXANDRIA (grant no. 339233), DESIR (grant no. 31081), and H2020 AFEL project (grant no. 687916).
PY - 2018
Y1 - 2018
N2 - Crowdsourcing results acquired for tasks that comprise a subjective component (e.g. opinion detection, sentiment analysis) are affected by the inherent bias of the crowd workers. This leads to weaker and noisy ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection where we compare the worker’s opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.
AB - Crowdsourcing results acquired for tasks that comprise a subjective component (e.g. opinion detection, sentiment analysis) are affected by the inherent bias of the crowd workers. This leads to weaker and noisy ground-truth data. In this work we propose an approach for measuring crowd worker bias. We explore worker bias through the example task of bias detection where we compare the worker’s opinions with their annotations for specific topics. This is a first important step towards mitigating crowd worker bias in subjective tasks.
UR - http://www.scopus.com/inward/record.url?scp=85058941737&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85058941737
T3 - CEUR Workshop Proceedings
SP - 78
EP - 82
BT - SAD+CrowdBias 2018 Joint Proceedings SAD 2018 and CrowdBias 2018
T2 - SAD+CrowdBias 2018
Y2 - 5 July 2018 through 5 July 2018
ER -