CrowdHub: Extending crowdsourcing platforms for the controlled evaluation of task designs

J. Ramírez, S. Degiacomi, D. Zanella, M. Báez, et al. arXiv preprint arXiv:1909.02800, 2019. arxiv.org
We present CrowdHub, a tool for running systematic evaluations of task designs on top of crowdsourcing platforms. The goal is to support the evaluation process while avoiding potential experimental biases that, according to our empirical studies, can amount to a 38% loss in the utility of the collected dataset in uncontrolled settings. Using CrowdHub, researchers can map their experimental design and automate the complex process of managing task execution over time while controlling for returning workers and crowd demographics, thus reducing bias, increasing the utility of the collected data, and making more efficient use of a limited pool of subjects.
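The abstract's mention of controlling for returning workers can be made concrete with a rough sketch. The code below is not CrowdHub's implementation, only one plausible way to exclude workers who already participated and to balance the remaining workers across task-design conditions; the class name, condition labels, and quota logic are invented for illustration.

# Minimal sketch (assumed, not from the paper): exclude returning workers and
# assign new workers to the least-filled task-design condition.
from collections import defaultdict

class ConditionAssigner:
    def __init__(self, conditions, quota_per_condition):
        self.conditions = list(conditions)
        self.quota = quota_per_condition
        self.seen_workers = set()          # workers already used in any condition
        self.counts = defaultdict(int)     # accepted assignments per condition

    def assign(self, worker_id):
        """Return a condition for this worker, or None if they must be excluded."""
        if worker_id in self.seen_workers:
            return None                    # returning worker: exclude to avoid bias
        open_conditions = [c for c in self.conditions if self.counts[c] < self.quota]
        if not open_conditions:
            return None                    # all condition quotas are filled
        condition = min(open_conditions, key=lambda c: self.counts[c])
        self.seen_workers.add(worker_id)
        self.counts[condition] += 1
        return condition

if __name__ == "__main__":
    assigner = ConditionAssigner(["design_A", "design_B"], quota_per_condition=2)
    for w in ["w1", "w2", "w1", "w3", "w4", "w5"]:
        print(w, "->", assigner.assign(w))   # "w1" is rejected on its second visit

In this toy version the "least-filled condition" rule keeps sample sizes balanced over time; a real system would also persist worker history across task batches and platforms, which is part of what the paper describes automating.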