DOI: 10.1145/3371238.3371247

The Key Considerations In Building A Crowd-testing Platform For Software Developers

Published: 18 October 2019

Abstract

External testing of mobile software on a large number of mobile devices, by many users, is often needed to ensure quality. Currently, there is limited evidence on the extent to which developers accept large-scale crowd-testing. This paper aims to (1) gauge developers' perspectives on the participation of public, anonymous crowd testers with varied experience; and (2) gather the developers' needs that could reduce their concerns about dealing with public crowd testers and increase the likelihood of their using crowd-testing platforms. An online exploratory survey was conducted with 50 Android and iOS developers from different countries with diverse experience. The paper reveals several findings, including the information that must be provided by developers and crowd testers to achieve an effective crowd-testing process, and the factors that can ensure the reliability and accuracy of the results provided by public crowd testers. The findings show that 90% of developers are potentially willing to test via public crowd testers worldwide, on condition that several fundamental features are available that enable them to perform more realistic tests, without artificial environments, on large numbers of devices. The results also show that a group of developers does not consider testing a serious job they have to pay for, which can affect the gig economy and the global market. We aim to help individual developers and small development teams with limited resources to perform large-scale testing of their products.


Cited By

  • (2020) Definitive guidelines toward effective mobile devices crowdtesting methodology. International Journal of Crowd Science, 4(2):209--228. DOI: 10.1108/IJCS-01-2020-0002. Online publication date: 16 April 2020.

Published In

cover image ACM Other conferences
ICCSE'19: Proceedings of the 4th International Conference on Crowd Science and Engineering
October 2019
246 pages
ISBN:9781450376402
DOI:10.1145/3371238

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Gig-economy
  2. Large-scale crowd-testing
  3. Mobile App testing
  4. Public and Anonymous Crowd Testers

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ICCSE'19

Acceptance Rates

ICCSE'19 Paper Acceptance Rate: 35 of 92 submissions, 38%
Overall Acceptance Rate: 92 of 247 submissions, 37%
