DOI: 10.1145/2499149.2499168
Keep it simple: reward and task design in crowdsourcing

Published: 16 September 2013
    Abstract

    Crowdsourcing is emerging as an effective method for performing tasks that require human abilities, such as tagging photos, transcribing handwriting and categorising data. Crowd workers perform small chunks of larger tasks in return for a reward, which is generally monetary. Reward can be one factor motivating workers to produce higher-quality results. Yet, as highlighted by previous research, the design of the task, in terms of its instructions and user interface, can also shape workers' perception of the task and thus the quality of the results. In this study we investigate both factors, reward and task design, to better understand their role in relation to the quality of work in crowdsourcing. In Experiment 1 we test a variety of reward schemas, while in Experiment 2 we measure the effects of task and interface complexity on attention. The long-term goal is to establish guidelines for designing tasks with the aim of maximizing workers' performance.



      Published In

      CHItaly '13: Proceedings of the Biannual Conference of the Italian Chapter of SIGCHI
      September 2013, 215 pages
      ISBN: 9781450320610
      DOI: 10.1145/2499149

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. cognition
      2. crowdsourcing
      3. motivation
      4. user interface

      Qualifiers

      • Research-article

      Conference

      CHItaly '13

      Acceptance Rates

      Overall Acceptance Rate 109 of 242 submissions, 45%

      Article Metrics

      • Downloads (last 12 months): 42
      • Downloads (last 6 weeks): 4

      Reflects downloads up to 11 Aug 2024

      Cited By
      • (2024) A.I. Robustness: a Human-Centered Perspective on Technological Challenges and Opportunities. ACM Computing Surveys. DOI: 10.1145/3665926. Online publication date: 27-May-2024.
      • (2024) If in a Crowdsourced Data Annotation Pipeline, a GPT-4. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-25. DOI: 10.1145/3613904.3642834. Online publication date: 11-May-2024.
      • (2023) How does Value Similarity affect Human Reliance in AI-Assisted Ethical Decision Making? Proceedings of the 2023 AAAI/ACM Conference on AI, Ethics, and Society, 49-57. DOI: 10.1145/3600211.3604709. Online publication date: 8-Aug-2023.
      • (2023) Diverse Perspectives Can Mitigate Political Bias in Crowdsourced Content Moderation. Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency, 1280-1291. DOI: 10.1145/3593013.3594080. Online publication date: 12-Jun-2023.
      • (2023) Supporting Requesters in Writing Clear Crowdsourcing Task Descriptions Through Computational Flaw Assessment. Proceedings of the 28th International Conference on Intelligent User Interfaces, 737-749. DOI: 10.1145/3581641.3584039. Online publication date: 27-Mar-2023.
      • (2023) Cognitive personalization for online microtask labor platforms: A systematic literature review. User Modeling and User-Adapted Interaction 34(3), 617-658. DOI: 10.1007/s11257-023-09383-w. Online publication date: 19-Sep-2023.
      • (2023) The Dark Side of Recruitment in Crowdsourcing: Ethics and Transparency in Micro-Task Marketplaces. Computer Supported Cooperative Work (CSCW) 32(3), 439-474. DOI: 10.1007/s10606-023-09464-9. Online publication date: 28-Jul-2023.
      • (2022) The Influences of Task Design on Crowdsourced Judgement: A Case Study of Recidivism Risk Evaluation. Proceedings of the ACM Web Conference 2022, 1685-1696. DOI: 10.1145/3485447.3512239. Online publication date: 25-Apr-2022.
      • (2022) What Quality Control Mechanisms do We Need for High-Quality Crowd Work? IEEE Access 10, 99709-99723. DOI: 10.1109/ACCESS.2022.3207292. Online publication date: 2022.
      • (2021) Perceived Complexity of a Project's Optimal Work Plan Influences Its Likelihood of Adoption by Project Managers. Project Management Journal. DOI: 10.1177/87569728211026509. Online publication date: 16-Jul-2021.
