DOI: 10.1145/3629479.3629485

Do you see any problem? On the Developers Perceptions in Test Smells Detection

Published: 06 December 2023

Abstract

Developers continuously implement changes to meet user demands. In the context of test-driven development, before any new code is added, a test case should be written to ensure the new changes do not introduce bugs. During this process, developers and testers might adopt bad design choices, which may lead to the introduction of so-called Test Smells in the code. Test Smells are poor solutions for implementing or designing test code. We perform a broad study to investigate participants’ perceptions of the presence of Test Smells. We analyze whether factors related to the participants’ profiles, concerning background and experience, may influence their perception of Test Smells. We also analyze whether the heuristics adopted by developers influence their perceptions of the existence of Test Smells. We analyze commits of open-source projects to identify the introduction of Test Smells. Then, we conduct an empirical study with 25 participants who evaluate instances of 10 different smell types. For each Test Smell type, we analyze the agreement among participants and assess the influence of different factors on their evaluations. Altogether, more than 1,250 evaluations were made. The results indicate that participants show low agreement on detecting all 10 Test Smell types analyzed in our study. The results also suggest that factors related to background and experience do not have a consistent effect on the agreement among participants. On the other hand, the agreement is consistently influenced by the specific heuristics participants employ. Our findings reveal that participants detect Test Smells in significantly different ways. Consequently, these findings raise questions about the results of previous studies that do not account for participants’ differing perceptions when detecting Test Smells.


Published In

SBQS '23: Proceedings of the XXII Brazilian Symposium on Software Quality
November 2023
391 pages
ISBN:9798400707865
DOI:10.1145/3629479

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. Empirical Study
  2. Human Factors
  3. Open Source
  4. Test Smells

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • FAPEAL
  • CAPES
  • CNPQ
  • PRONEX
  • FACEPE

Conference

SBQS '23
SBQS '23: XXII Brazilian Symposium on Software Quality
November 7 - 10, 2023
Brasília, Brazil

Acceptance Rates

Overall Acceptance Rate 35 of 99 submissions, 35%
