DOI: 10.1145/3593434.3593465

Aggregating N-fold Requirements Inspection Results

Published: 14 June 2023

Abstract

Requirements validation is an important aspect of ensuring high-quality software. Requirements inspections are commonly used, in which the specification is read by different persons who assume different roles or apply different reading techniques, sometimes supported by checklists. Defect detection with requirements inspection is costly, and detection rates must be considered low. Therefore, validation is repeated or performed by multiple inspection groups, an approach known as N-fold inspection. However, this yields not only more detected defects but also more false positives. In this paper, we investigate how defect aggregation can be used to improve the overall quality of validation. To this end, we conducted an experiment with 22 N-fold inspection groups consisting of four to five reviewers each. Results show that simply aggregating all results leads to a number of false positives that can negatively impact the validation task, while more tailored aggregation strategies can considerably improve the validation of requirements with N-fold inspections.
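As an illustration of the aggregation idea discussed above (a minimal sketch, not taken from the paper), the following Python snippet contrasts a simple union of all group findings with a threshold-based strategy that keeps only defects reported by at least two groups; the defect identifiers, the report data, and the threshold are hypothetical.

    from collections import Counter

    def aggregate_union(reports):
        # Union aggregation: keep every defect reported by any inspection group.
        return set().union(*reports)

    def aggregate_threshold(reports, min_groups=2):
        # Threshold aggregation: keep only defects reported by at least min_groups groups.
        counts = Counter(d for report in reports for d in set(report))
        return {d for d, n in counts.items() if n >= min_groups}

    # Hypothetical findings of three inspection groups (FP-* denote false positives).
    reports = [
        {"R1-ambiguous", "R3-missing", "FP-a"},
        {"R1-ambiguous", "R3-missing", "FP-b"},
        {"R1-ambiguous", "FP-c"},
    ]

    print(sorted(aggregate_union(reports)))         # 5 findings, 3 of them false positives
    print(sorted(aggregate_threshold(reports, 2)))  # ['R1-ambiguous', 'R3-missing']

Under these assumptions, the union maximizes the number of reported defects but carries every false positive into the consolidated list, whereas the threshold variant trades some recall for a cleaner result; which trade-off pays off in practice is what the experiment reported in the paper examines.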



Published In

EASE '23: Proceedings of the 27th International Conference on Evaluation and Assessment in Software Engineering
June 2023
544 pages
ISBN: 9798400700446
DOI: 10.1145/3593434
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 14 June 2023


Author Tags

  1. requirements inspection
  2. software inspection
  3. validation

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

EASE '23

Acceptance Rates

Overall Acceptance Rate 71 of 232 submissions, 31%
