Are fit tables really talking?: a series of experiments to understand whether fit tables are useful during evolution tasks

Published: 10 May 2008

Abstract

Test-driven software development tackles the problem of operationally defining the features to be implemented by means of test cases. This approach was recently ported to the early development phase, when requirements are gathered and clarified. Among the existing proposals, Fit (Framework for Integrated Test) supports the precise specification of requirements by means of so-called Fit tables, which express relevant usage scenarios in a tabular format that is easily understood by the customer as well. Fit tables can be turned into executable test cases through the creation of pieces of glue code, called fixtures.
In this paper, we test the claimed benefits of Fit through a series of three controlled experiments in which Fit tables and the related fixtures are used to clarify a set of change requirements in a software evolution scenario. Results indicate improved correctness with no significant impact on time; however, the benefits of Fit vary substantially depending on the developers' experience. Preliminary results on the use of Fit in combination with pair programming revealed another relevant source of variation.
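To illustrate the mechanism the abstract describes, the sketch below mimics how a Fit-style column fixture binds a customer-facing requirements table to glue code: input columns set fixture attributes, and columns ending in "()" are checked against fixture methods. This is a minimal illustrative sketch in Python, not the Fit framework's actual Java API; the fixture name, the table contents, and the numeric-only input handling are all hypothetical simplifications.

```python
class ColumnFixture:
    """Minimal stand-in for a Fit-style column fixture: binds table
    columns to fixture attributes (inputs) and methods (expected outputs)."""

    def run(self, header, rows):
        results = []
        for row in rows:
            for name, value in zip(header, row):
                if name.endswith("()"):
                    # Expected-output column: call the fixture method
                    # and compare with the cell value.
                    actual = getattr(self, name[:-2])()
                    results.append(actual == type(actual)(value))
                else:
                    # Input column: set the fixture attribute.
                    # (Simplification: all inputs are treated as numbers.)
                    setattr(self, name, float(value))
        return results


class DiscountFixture(ColumnFixture):
    """Hypothetical glue code for an 'order discount' requirement."""
    amount = 0.0

    def discount(self):
        return 5.0 if self.amount > 100 else 0.0


# The customer-facing table: one input column, one expected-output column.
header = ["amount", "discount()"]
rows = [["50", "0.0"], ["200", "5.0"]]
print(DiscountFixture().run(header, rows))  # one pass/fail per checked cell
```

In the real framework, the table is written by the customer (typically as an HTML document), and the fixture is the only code a developer must supply to make the table executable as an acceptance test.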



      Published In

      ICSE '08: Proceedings of the 30th international conference on Software engineering
      May 2008
      558 pages
      ISBN:9781605580791
      DOI:10.1145/1368088

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. acceptance test
      2. empirical studies
      3. software maintenance

      Qualifiers

      • Research-article

      Conference

      ICSE '08

      Acceptance Rates

ICSE '08 paper acceptance rate: 56 of 370 submissions, 15%
Overall acceptance rate: 276 of 1,856 submissions, 15%


      Cited By

• (2023) Too Simple? Notions of Task Complexity used in Maintenance-based Studies of Programming Tools. 2023 IEEE/ACM 31st International Conference on Program Comprehension (ICPC), pp. 254-265. DOI: 10.1109/ICPC58990.2023.00040
• (2020) Analyzing Families of Experiments in SE: A Systematic Mapping Study. IEEE Transactions on Software Engineering, 46(5), pp. 566-583. DOI: 10.1109/TSE.2018.2864633
• (2020) On the Effect of Noise on Software Engineers' Performance: Results from Two Replicated Experiments. 2020 46th Euromicro Conference on Software Engineering and Advanced Applications (SEAA), pp. 334-341. DOI: 10.1109/SEAA51224.2020.00062
• (2019) End-to-end model-transformation comprehension through fine-grained traceability information. Software and Systems Modeling (SoSyM), 18(2), pp. 1305-1344. DOI: 10.1007/s10270-017-0602-0
• (2019) Empirical Software Engineering. Handbook of Software Engineering, pp. 285-320. DOI: 10.1007/978-3-030-00262-6_7
• (2018) The effect of noise on software engineers' performance. Proceedings of the 12th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement, pp. 1-10. DOI: 10.1145/3239235.3240496
• (2018) Exploring the Use of Rapid Type Analysis for Detecting the Dead Method Smell in Java Code. 2018 44th Euromicro Conference on Software Engineering and Advanced Applications (SEAA), pp. 167-174. DOI: 10.1109/SEAA.2018.00035
• (2018) On the impact of state-based model-driven development on maintainability: a family of experiments using UniMod. Empirical Software Engineering, 23(3), pp. 1743-1790. DOI: 10.1007/s10664-017-9563-8
• (2016) Literature Review of Empirical Research Studies within the Domain of Acceptance Testing. 2016 42nd Euromicro Conference on Software Engineering and Advanced Applications (SEAA), pp. 181-188. DOI: 10.1109/SEAA.2016.33
• (2015) Executable Test Specifications Show No Negative Effect on Finding and Correcting Test Faults in Event Processing Application Development. Proceedings of the 2015 41st Euromicro Conference on Software Engineering and Advanced Applications, pp. 18-26. DOI: 10.1109/SEAA.2015.16
