
Investigation of individual factors impacting the effectiveness of requirements inspections: a replicated experiment

Published: 01 February 2014

Abstract

This paper presents a replication of an empirical study regarding the impact of individual factors on the effectiveness of requirements inspections. Experimental replications are important for verifying results and investigating the generality of empirical studies. We used the lab package and procedures from the original study, with some changes and additions, to conduct the replication with 69 professional developers at three different companies in Turkey. In general, the results of the replication were consistent with those of the original study. The main result from the original study, supported in the replication, was that inspectors whose degrees are in a field related to software engineering are less effective during a requirements inspection than inspectors whose degrees are in other fields. In addition, we found that Company, Experience, and English Proficiency impacted inspection effectiveness.




Published In

Empirical Software Engineering  Volume 19, Issue 1
February 2014
266 pages

Publisher

Kluwer Academic Publishers

United States

Author Tags

  1. Empirical studies
  2. Replication
  3. Requirements
  4. Software engineering
  5. Software inspections

Qualifiers

  • Article


Cited By

  • (2023) Investigating Factors Influencing Students' Assessment of Conceptual Models. Proceedings of the 27th International Conference on Evaluation and Assessment in Software Engineering, pp. 470-474. DOI: 10.1145/3593434.3593960. Online publication date: 14-Jun-2023.
  • (2023) Effect of Requirements Analyst Experience on Elicitation Effectiveness: A Family of Quasi-Experiments. IEEE Transactions on Software Engineering 49(4):2088-2106. DOI: 10.1109/TSE.2022.3210076. Online publication date: 1-Apr-2023.
  • (2023) Operationalizing validity of empirical software engineering studies. Empirical Software Engineering 28(6). DOI: 10.1007/s10664-023-10370-3. Online publication date: 13-Nov-2023.
  • (2022) A model-based approach for specifying changes in replications of empirical studies in computer science. Computing 105(6):1189-1213. DOI: 10.1007/s00607-022-01133-x. Online publication date: 3-Dec-2022.
  • (2016) Using Eye Tracking to Investigate Reading Patterns and Learning Styles of Software Requirement Inspectors to Enhance Inspection Team Outcome. Proceedings of the 10th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement, pp. 1-10. DOI: 10.1145/2961111.2962598. Online publication date: 8-Sep-2016.
  • (2016) Effect of Domain Knowledge on Elicitation Effectiveness: An Internally Replicated Controlled Experiment. IEEE Transactions on Software Engineering 42(5):427-451. DOI: 10.1109/TSE.2015.2494588. Online publication date: 1-May-2016.
  • (2014) Evidence of the presence of bias in subjective metrics. Proceedings of the 18th International Conference on Evaluation and Assessment in Software Engineering, pp. 1-4. DOI: 10.1145/2601248.2601291. Online publication date: 13-May-2014.
