
Toward Annotator Group Bias in Crowdsourcing

Haochen Liu, Joseph Thekinen, Sinem Mollaoglu, Da Tang, Ji Yang, Youlong Cheng, Hui Liu, Jiliang Tang


Abstract
Crowdsourcing has emerged as a popular approach for collecting annotated data to train supervised machine learning models. However, annotator bias can lead to defective annotations. Although a few works have investigated individual annotator bias, the group effects among annotators are largely overlooked. In this work, we reveal that annotators within the same demographic group tend to show consistent group bias in annotation tasks, and we thus conduct an initial study on annotator group bias. We first empirically verify the existence of annotator group bias in various real-world crowdsourcing datasets. Then, we develop a novel probabilistic graphical framework, GroupAnno, to capture annotator group bias with an extended Expectation-Maximization (EM) algorithm. We conduct experiments on both synthetic and real-world datasets. Experimental results demonstrate the effectiveness of our model in capturing annotator group bias in label aggregation and model learning over competitive baselines.
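The abstract describes label aggregation via an extended EM algorithm over group-level annotator bias. As a rough illustration of the idea only, below is a minimal sketch of EM-based aggregation in which each demographic group shares one confusion matrix (a Dawid-Skene-style variant). The function name, data layout, and update rules are assumptions for illustration; this is not the paper's GroupAnno implementation.

import numpy as np

def em_group_aggregate(annotations, groups, n_classes, n_groups, n_iter=50):
    """Hypothetical sketch: EM label aggregation with one confusion
    matrix per annotator group (not the paper's actual model).

    annotations: list of (item, annotator, label) triples
    groups: dict mapping annotator id -> group id
    Returns (posterior over true labels per item, per-group confusion matrices).
    """
    items = sorted({i for i, _, _ in annotations})
    item_idx = {it: k for k, it in enumerate(items)}
    n_items = len(items)

    # Initialize posteriors from label proportions (soft majority vote).
    post = np.full((n_items, n_classes), 1e-6)
    for it, a, y in annotations:
        post[item_idx[it], y] += 1.0
    post /= post.sum(axis=1, keepdims=True)

    prior = np.full(n_classes, 1.0 / n_classes)

    for _ in range(n_iter):
        # M-step: re-estimate class prior and group confusion matrices
        # conf[g, t, y] = P(annotator in group g reports y | true label t).
        prior = post.mean(axis=0)
        conf = np.full((n_groups, n_classes, n_classes), 1e-6)
        for it, a, y in annotations:
            conf[groups[a], :, y] += post[item_idx[it]]
        conf /= conf.sum(axis=2, keepdims=True)

        # E-step: posterior over true labels given group-biased annotations.
        log_post = np.tile(np.log(prior + 1e-12), (n_items, 1))
        for it, a, y in annotations:
            log_post[item_idx[it]] += np.log(conf[groups[a], :, y])
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)

    return post, conf

if __name__ == "__main__":
    # Toy usage: two items, three annotators, two demographic groups.
    ann = [(0, "a1", 1), (0, "a2", 1), (0, "a3", 0),
           (1, "a1", 0), (1, "a2", 0), (1, "a3", 0)]
    grp = {"a1": 0, "a2": 0, "a3": 1}
    post, conf = em_group_aggregate(ann, grp, n_classes=2, n_groups=2)
    print(post.argmax(axis=1))  # aggregated labels per item

Tying the confusion matrix to the group rather than the individual annotator is what lets the model pool evidence across annotators who share a demographic attribute, which is the core intuition behind modeling group bias.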
Anthology ID: 2022.acl-long.126
Volume: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: May
Year: 2022
Address: Dublin, Ireland
Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 1797–1806
URL: https://aclanthology.org/2022.acl-long.126
DOI: 10.18653/v1/2022.acl-long.126
Cite (ACL): Haochen Liu, Joseph Thekinen, Sinem Mollaoglu, Da Tang, Ji Yang, Youlong Cheng, Hui Liu, and Jiliang Tang. 2022. Toward Annotator Group Bias in Crowdsourcing. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1797–1806, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal): Toward Annotator Group Bias in Crowdsourcing (Liu et al., ACL 2022)
PDF: https://aclanthology.org/2022.acl-long.126.pdf
Video: https://aclanthology.org/2022.acl-long.126.mp4