DOI: 10.1145/3387940.3392243
Short paper

Strategies for Crowdworkers to Overcome Barriers in Competition-based Software Crowdsourcing Development

Published: 25 September 2020

Abstract

Crowdsourcing in software development draws on a large, on-demand pool of developers to outsource parts of a software project, or even the entire project, to a crowd. To succeed, it requires a continuous influx of developers, or simply crowdworkers. However, crowdworkers face many barriers when attempting to participate in software crowdsourcing, and these barriers often lead to a low number and poor quality of submitted solutions. In our previous work, we identified several barriers faced by crowdworkers, including finding a task that matches their abilities, setting up the environment to perform the task, and managing their personal time. We also proposed six strategies to overcome or minimize these barriers. In this paper, the six strategies are evaluated by consulting Software Crowdsourcing (SW CS) experts. The results show that software crowdsourcing platforms need to: (i) provide a system that helps match task requirements to crowdworkers' profiles; (ii) adopt containers or virtual machines to help crowdworkers set up the environment needed to perform a task; (iii) help crowdworkers plan and manage their personal time; and (iv) adopt communication channels that allow crowdworkers to clarify questions about the requirements and, as a consequence, complete their tasks.
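The strategies above are stated at the level of platform design rather than implementation. As a purely illustrative aid, the sketch below shows one way strategy (i) could work: scoring each task by how much of its required skill set a crowdworker's profile covers. It is a minimal, hypothetical Python example; all task names, skills, and helper functions are invented here and do not come from the paper.

    from dataclasses import dataclass, field

    @dataclass
    class Task:
        title: str
        required_skills: set[str]

    @dataclass
    class Worker:
        name: str
        skills: set[str] = field(default_factory=set)

    def match_score(task: Task, worker: Worker) -> float:
        """Fraction of the task's required skills covered by the worker (0.0 to 1.0)."""
        if not task.required_skills:
            return 0.0
        return len(task.required_skills & worker.skills) / len(task.required_skills)

    def recommend_tasks(worker: Worker, tasks: list[Task], top_n: int = 3) -> list[tuple[Task, float]]:
        """Rank tasks for a worker by skill coverage, highest first."""
        ranked = sorted(((t, match_score(t, worker)) for t in tasks),
                        key=lambda pair: pair[1], reverse=True)
        return ranked[:top_n]

    if __name__ == "__main__":
        # Invented example data, for illustration only.
        tasks = [
            Task("REST API bug fix", {"java", "spring", "sql"}),
            Task("Mobile UI prototype", {"kotlin", "android"}),
            Task("Data pipeline script", {"python", "sql"}),
        ]
        worker = Worker("alice", {"python", "sql", "java"})
        for task, score in recommend_tasks(worker, tasks):
            print(f"{task.title}: {score:.2f}")

A real platform would likely go further, for example weighting skills, using past submission history, or updating profiles automatically, but the coverage score captures the matching idea the experts endorsed.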


Cited By

  • (2024) "Machine learning based software effort estimation using development-centric features for crowdsourcing platform," Intelligent Data Analysis, vol. 28, no. 1, pp. 299-329. DOI: 10.3233/IDA-237366. Online publication date: 3 Feb 2024.


    Published In

    ICSEW'20: Proceedings of the IEEE/ACM 42nd International Conference on Software Engineering Workshops
    June 2020
    831 pages
    ISBN:9781450379632
    DOI:10.1145/3387940
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 25 September 2020


    Author Tags

    1. Software crowdsourcing
    2. barriers
    3. strategies

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Conference

    ICSE '20
    ICSE '20: 42nd International Conference on Software Engineering
    June 27 - July 19, 2020
    Seoul, Republic of Korea

    Article Metrics

    • Downloads (last 12 months): 2
    • Downloads (last 6 weeks): 0

    Reflects downloads up to 26 Jan 2025.
