
Code coverage for suite evaluation by developers

Published: 31 May 2014

Abstract

One of the key challenges for developers testing code is determining a test suite's quality -- its ability to find faults. The most common approach is to use code coverage as a measure of test suite quality, and diminishing returns in coverage or high absolute coverage as a stopping rule. In testing research, suite quality is often evaluated by a suite's ability to kill mutants (artificially seeded potential faults). Determining which criteria best predict mutation kills is critical to practical estimation of test suite quality. Previous work has only used small sets of programs, and usually compares multiple suites for a single program. Practitioners, however, seldom compare suites -- they evaluate one suite. Using suites (both manual and automatically generated) from a large set of real-world open-source projects shows that evaluation results differ from those for suite comparison: statement (not block, branch, or path) coverage predicts mutation kills best.
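The two measures the abstract relates can be made concrete with a small sketch. The following is purely illustrative (it is not the paper's experimental tooling, and the toy function, test suite, and mutants are invented for this example): it measures statement coverage of a function under test using the interpreter's line tracer, then computes a mutation score against two hand-seeded mutants of the kind a mutation tool such as PIT generates automatically.

```python
# Illustrative sketch (not the paper's tooling): statement coverage and
# mutation score for a toy function under a deliberately weak test suite.
import sys

def max_of(a, b):      # function under test: 3 executable statements
    if a > b:
        return a
    return b

# Weak suite: the a > b branch body is never exercised.
tests = [((1, 5), 5), ((2, 2), 2)]

def run_suite(fn):
    """Return True iff every test passes against fn."""
    return all(fn(*args) == want for args, want in tests)

# --- statement coverage, via the interpreter's line tracer ---
executed = set()
def tracer(frame, event, arg):
    if event == "line" and frame.f_code is max_of.__code__:
        executed.add(frame.f_lineno)   # record executed source lines
    return tracer

sys.settrace(tracer)
run_suite(max_of)
sys.settrace(None)
print(f"statement coverage: {len(executed)}/3")   # 'return a' never runs

# --- mutation score: fraction of seeded faults the suite detects ---
mutants = [
    lambda a, b: a if a >= b else b,  # boundary mutant: > becomes >=
    lambda a, b: b if a > b else a,   # return values swapped
]
killed = sum(1 for m in mutants if not run_suite(m))
print(f"mutation score: {killed}/{len(mutants)}")
```

The weak suite illustrates the gap the paper studies: the boundary mutant survives because no test distinguishes `>` from `>=`, so the mutation score exposes a weakness that raw coverage figures alone would not.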

Cited By

  • (2025) Subsumption, correctness and relative correctness: Implications for software testing. Science of Computer Programming, 239:103177, Jan 2025. DOI: 10.1016/j.scico.2024.103177
  • (2025) Testability-driven development. Computer Standards & Interfaces, 91:C, Jan 2025. DOI: 10.1016/j.csi.2024.103877
  • (2024) Mutation Testing as a Quality Assurance Technique in a Fintech Company: An Experience Report in The Industry. Proceedings of the XXIII Brazilian Symposium on Software Quality, pages 446–451, Nov 2024. DOI: 10.1145/3701625.3701629
  • (2024) Software Development Practices and Tools for University-Industry R&D Projects. Proceedings of the XXIII Brazilian Symposium on Software Quality, pages 426–437, Nov 2024. DOI: 10.1145/3701625.3701627
  • (2024) An Empirical Study on Code Coverage of Performance Testing. Proceedings of the 28th International Conference on Evaluation and Assessment in Software Engineering, pages 48–57, Jun 2024. DOI: 10.1145/3661167.3661196
  • (2024) Planning to Guide LLM for Code Coverage Prediction. Proceedings of the 2024 IEEE/ACM First International Conference on AI Foundation Models and Software Engineering, pages 24–34, Apr 2024. DOI: 10.1145/3650105.3652292
  • (2024) Mutating Matters: Analyzing the Influence of Mutation Testing in Programming Courses. Proceedings of the 2024 ACM Virtual Global Computing Education Conference V. 1, pages 151–157, Dec 2024. DOI: 10.1145/3649165.3690110
  • (2024) Productive Coverage: Improving the Actionability of Code Coverage. Proceedings of the 46th International Conference on Software Engineering: Software Engineering in Practice, pages 58–68, Apr 2024. DOI: 10.1145/3639477.3639733
  • (2024) Concretely Mapped Symbolic Memory Locations for Memory Error Detection. IEEE Transactions on Software Engineering, pages 1–21, 2024. DOI: 10.1109/TSE.2024.3395412
  • (2024) Detecting Faults vs. Revealing Failures: Exploring the Missing Link. 2024 IEEE 24th International Conference on Software Quality, Reliability and Security (QRS), pages 115–126, Jul 2024. DOI: 10.1109/QRS62785.2024.00021


Published In

ICSE 2014: Proceedings of the 36th International Conference on Software Engineering
May 2014
1139 pages
ISBN:9781450327565
DOI:10.1145/2568225

In-Cooperation

  • TCSE: IEEE Computer Society's Tech. Council on Software Engin.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. evaluation of coverage criteria
  2. statistical analysis
  3. test frameworks

Qualifiers

  • Research-article

Conference

ICSE '14

Acceptance Rates

Overall Acceptance Rate 276 of 1,856 submissions, 15%

