Managing the crowds: the effect of prize guarantees and in-process feedback on participation in crowdsourcing contests

Published: 01 March 2019

Abstract

Crowdsourcing contests are contests through which organizations tap into the wisdom of crowds by outsourcing tasks to large groups of people on the Internet. In an online environment often characterized by anonymity and a lack of trust, such contests carry inherent uncertainties for participants. This study focuses on crowdsourcing contests with winner-take-all prizes, in which submissions are made sequentially and contest hosts can provide public in-process feedback on submissions as soon as they are received. Drawing on the uncertainty literature, we examine how prize guarantees (a commitment that a winner will be picked and paid) and in-process feedback (numeric ratings of individual designs and public textual comments during the contest) can help attract more submissions by influencing the various uncertainties contestants face. We find that guaranteeing the prize increases submissions. The volume of in-process feedback (both numeric ratings and textual comments) has a positive effect on the number of submissions, and this effect is larger in contests without prize guarantees. In addition, highly positive or extremely negative feedback discourages future submissions overall, and the negative effect of highly positive feedback is mitigated in guaranteed contests.


Cited By

  • (2024) "Idea Generation Performance in Open Innovation Communities," Information and Management (61:3). doi:10.1016/j.im.2024.103930
  • (2023) "More Is Better? Understanding the Effects of Online Interactions on Patients' Health Anxiety," Journal of the Association for Information Science and Technology (74:11), pp. 1243-1264. doi:10.1002/asi.24822
  • (2022) "Seeker Exemplars and Quantitative Ideation Outcomes in Crowdsourcing Contests," Information Systems Research (33:1), pp. 265-284. doi:10.1287/isre.2021.1054
  • (2022) "Task Navigation Panel for Amazon Mechanical Turk," in Proceedings of the 5th International Conference on Computer Science and Software Engineering, pp. 574-580. doi:10.1145/3569966.3570108
  • (2021) "Winning by Learning? Effect of Knowledge Sharing in Crowdsourcing Contests," Information Systems Research (32:3), pp. 836-859. doi:10.1287/isre.2020.0982


Published In

MIS Quarterly, Volume 43, Issue 1, March 2019, 665 pages
ISSN: 0276-7783

Publisher

Society for Information Management and The Management Information Systems Research Center, United States


Author Tags

  1. crowdsourcing contest
  2. feedback
  3. participation
  4. prize guarantees
  5. uncertainty

