DOI: 10.1145/3235765.3235796
FDG '18 Conference Proceedings
Research-article · Open access

Comparing paid and volunteer recruitment in human computation games

Published: 07 August 2018

Abstract

Paid platforms like Mechanical Turk are popular for recruiting players for playtesting and experiments. However, it is unclear whether paid players behave like, or have experiences similar to, volunteers (i.e. players recruited for free through banner ads or game portals). In this work, we studied the impact of recruitment within human computation games, using two experiments. First, we compared voluntary recruitment against paid recruitment at different compensation levels. We found that the highest-paid players completed more levels (i.e. achieved a higher volume of completed tasks) and reported greater engagement than both volunteers and players paid less, while volunteers completed levels of higher difficulty (i.e. achieved a higher quality of completed tasks) than paid players. Second, we varied both recruitment strategy and the game's design and found no interaction effects, suggesting that while differences exist between volunteer and paid players, experimental changes do not impact those players differently.




    Published In

    FDG '18: Proceedings of the 13th International Conference on the Foundations of Digital Games
    August 2018, 503 pages

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. crowdsourcing
    2. human computation games
    3. paid recruitment
    4. player engagement
    5. volunteer recruitment


    Conference

    FDG '18: Foundations of Digital Games 2018
    August 7-10, 2018, Malmö, Sweden

    Acceptance Rates

    FDG '18 Paper Acceptance Rate: 39 of 95 submissions, 41%
    Overall Acceptance Rate: 152 of 415 submissions, 37%


