DOI: 10.1145/3195538.3195542

Measuring and improving testability of system requirements in an industrial context by applying the goal question metric approach

Published: 02 June 2018

Abstract

Testing is subject to two basic constraints: cost and quality. Cost depends on the efficiency of the testing activities as well as on the quality and testability of the artifacts under test. The authors' practical experience with large-scale systems shows that testability decreases when requirements are adapted iteratively or the architecture is altered. However, a root cause analysis of these testability degradations, and the introduction of improvement measures during software development, is often lacking. Good testability of software artifacts is vital for introducing agile practices into the rigid strategy of the V-model; testability is thus also a bridgehead towards agility. In this paper, we report on a case study in which we measure and improve testability on the basis of the Goal Question Metric approach.
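
The Goal Question Metric approach named in the abstract structures measurement top-down: a goal is refined into questions, and each question is answered by metrics. As an illustrative sketch only (the goals, questions, metrics, and thresholds below are hypothetical, not taken from the paper), a GQM model for requirements testability might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    value: float       # measured value, e.g. a coverage ratio
    threshold: float   # value at or above the threshold counts as satisfied

    def satisfied(self) -> bool:
        return self.value >= self.threshold

@dataclass
class Question:
    text: str
    metrics: list[Metric] = field(default_factory=list)

@dataclass
class Goal:
    purpose: str
    questions: list[Question] = field(default_factory=list)

    def satisfaction_ratio(self) -> float:
        """Fraction of all metrics, across all questions, meeting their threshold."""
        all_metrics = [m for q in self.questions for m in q.metrics]
        if not all_metrics:
            return 0.0
        return sum(m.satisfied() for m in all_metrics) / len(all_metrics)

# Hypothetical example goal with two questions and one metric each.
goal = Goal(
    purpose="Improve testability of system requirements",
    questions=[
        Question("Are the requirements unambiguous?",
                 [Metric("share of requirements without smells", 0.80, 0.90)]),
        Question("Are the requirements traceable to tests?",
                 [Metric("requirements covered by at least one test", 0.95, 0.90)]),
    ],
)
print(f"{goal.satisfaction_ratio():.2f}")  # prints 0.50 (one of two metrics satisfied)
```

The point of the top-down structure is that every collected number is justified by a question, which is in turn justified by a goal, so metrics without an interpretation context are never gathered.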



    Published In

    RET '18: Proceedings of the 5th International Workshop on Requirements Engineering and Testing
    June 2018
    54 pages
    ISBN:9781450357494
    DOI:10.1145/3195538

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. complexity
    2. empirical study
    3. requirements quality
    4. requirements-based testing
    5. software quality
    6. system testing
    7. testability
    8. traceability

    Qualifiers

    • Research-article

    Funding Sources

    • Knowledge Foundation (KKS) of Sweden

    Conference

    ICSE '18

    Cited By

    • (2024) Natural language requirements testability measurement based on requirement smells. Neural Computing and Applications 36:21, 13051--13085. DOI: 10.1007/s00521-024-09730-x. Online publication date: 1-Jul-2024.
    • (2023) Systematic Literature Review on Test Case Quality Characteristics and Metrics. 2023 3rd International Conference on Emerging Smart Technologies and Applications (eSmarTA), 01--08. DOI: 10.1109/eSmarTA59349.2023.10293544. Online publication date: 10-Oct-2023.
    • (2023) Naive Bayes Classification Model for Precondition-Postcondition in Software Requirements. 2023 International Conference on Data Science and Its Applications (ICoDSA), 123--128. DOI: 10.1109/ICoDSA58501.2023.10277397. Online publication date: 9-Aug-2023.
    • (2023) A Qualitative and Comprehensive Analysis of Software Testability Metrics and their Trends. 2023 Second International Conference on Electronics and Renewable Systems (ICEARS), 1545--1552. DOI: 10.1109/ICEARS56392.2023.10085333. Online publication date: 2-Mar-2023.
    • (2023) Verifying Agile Black-Box Test Case Quality Measurements: Expert Review. IEEE Access 11, 106987--107003. DOI: 10.1109/ACCESS.2023.3320576. Online publication date: 2023.
    • (2022) An empirical study on maintainable method size in Java. Proceedings of the 19th International Conference on Mining Software Repositories, 252--264. DOI: 10.1145/3524842.3527975. Online publication date: 23-May-2022.
    • (2022) Trends and Findings in Measuring Software Quality Metrics in the Industry. 2022 IEEE Biennial Congress of Argentina (ARGENCON), 1--8. DOI: 10.1109/ARGENCON55245.2022.9939935. Online publication date: 7-Sep-2022.
    • (2022) An initial investigation of the effect of quality factors on Agile test case quality through experts' review. Cogent Engineering 9:1. DOI: 10.1080/23311916.2022.2082121. Online publication date: 6-Jun-2022.
    • (2021) Challenges of Software Requirements Quality Assurance and Validation: A Systematic Literature Review. IEEE Access 9, 137613--137634. DOI: 10.1109/ACCESS.2021.3117989. Online publication date: 2021.
    • (2019) Research on Spacecraft Rapid Test Technology Based on Built-in Self-test. 2019 14th IEEE International Conference on Electronic Measurement & Instruments (ICEMI), 1593--1598. DOI: 10.1109/ICEMI46757.2019.9101568. Online publication date: Nov-2019.
