DOI: 10.1145/3474624.3474649
SBES Conference Proceedings · Short paper

Human-Oriented Software Engineering Experiments: The Large Gap in Experiment Reports

Published: 05 October 2021

Abstract

Context: Despite the existence of well-known guidelines for planning, conducting, and reporting experiments, experiment reports often lack essential information. This impairs external replication, decreases experiment quality and reliability, hinders knowledge dissemination, and makes it impossible to confirm the results.

Objective: Provide an in-depth study of how information from human-oriented controlled experiments in software engineering is reported after the emergence of supporting guidelines.

Method: A systematic mapping study was conducted in the main empirical software engineering and software engineering venues, covering the period following the publication of the supporting guidelines.

Results: We analyzed 412 articles reporting experiments, drawn from three conferences and three journals, and found that most of them omit crucial information about the experiments, such as participant reward, target population, hypotheses, and conclusion and construct validity.

Conclusion: There is a gap between the information the guidelines recommend reporting and what is actually reported. Of the 27 elements that, according to the guidelines, should appear in reports, 65% of the analyzed articles failed to report at least 13 (almost half). This finding contradicts the natural intuition that, as guidelines appear and mature, experiment reports would increasingly comply with them over the years. As a consequence, a flawed report may raise doubts about the quality and validity of the study.
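As a quick sanity check on the "almost half" qualifier, the fraction follows directly from the counts reported in the abstract (27 guideline elements, at least 13 unreported):

\[
\frac{13}{27} \approx 0.481 \approx 48\%
\]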


Cited By

  • On the Measures of Success in Replication of Controlled Experiments with STRIDE. International Journal of Software Engineering and Knowledge Engineering 34(04), 623–650. DOI: 10.1142/S0218194023500651. Online publication date: 23-May-2024.


Published In

SBES '21: Proceedings of the XXXV Brazilian Symposium on Software Engineering
September 2021, 473 pages
ISBN: 9781450390613
DOI: 10.1145/3474624
© 2021 Association for Computing Machinery. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Controlled Experiments
  2. Empirical Software Engineering
  3. Experiment Report
  4. Systematic Mapping Study

Qualifiers

  • Short paper
  • Research
  • Refereed limited


Conference

SBES '21
SBES '21: Brazilian Symposium on Software Engineering
September 27 - October 1, 2021
Joinville, Brazil

Acceptance Rates

Overall Acceptance Rate: 147 of 427 submissions, 34%

Article Metrics

  • Downloads (last 12 months): 7
  • Downloads (last 6 weeks): 0
Reflects downloads up to 22 Sep 2024
