DOI: 10.5555/520792.823953
Article

Can Results from Software Engineering Experiments be Safely Combined?

Published: 04 November 1999
    Abstract

    Deriving reliable empirical results from a single experiment is unlikely. Hence, to make progress, multiple experiments must be undertaken per hypothesis and the subsequent results effectively combined to produce a single reliable conclusion. Other disciplines use meta-analytic techniques to achieve this. The thesis of this paper is: can meta-analysis be successfully applied to current Software Engineering experiments? The question is investigated by examining a series of experiments that themselves investigate which defect detection technique is best. Applying meta-analysis techniques to the Software Engineering data is relatively straightforward, but unfortunately the results are highly unstable: the meta-analysis shows that the results are highly disparate and do not lead to a single reliable conclusion. The reason for this deficiency is the excessive variation within various components of the experiments. Finally, the paper describes a number of recommendations for controlling and reporting empirical work, to advance the discipline towards a position where meta-analysis can be profitably employed.
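    The combining step the abstract refers to can be illustrated with a standard fixed-effect (inverse-variance) meta-analysis together with Cochran's Q heterogeneity test, which is the kind of check that flags when per-experiment results are too disparate to yield a single reliable conclusion. This is a generic sketch, not code or data from the paper; the effect sizes and variances below are hypothetical.

    ```python
    # Illustrative fixed-effect meta-analysis with Cochran's Q test.
    # Not taken from the paper: effects/variances are made-up examples
    # (e.g. mean difference in defects found between two techniques).
    import math

    def fixed_effect_meta(effects, variances):
        """Combine per-study effect sizes by inverse-variance weighting.

        Returns (pooled_effect, pooled_variance, cochran_Q).
        A Q value large relative to k-1 degrees of freedom signals
        heterogeneity: the studies disagree more than chance allows,
        so the pooled estimate is not a safe summary.
        """
        weights = [1.0 / v for v in variances]
        pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        pooled_var = 1.0 / sum(weights)
        # Cochran's Q: weighted squared deviations from the pooled effect.
        q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
        return pooled, pooled_var, q

    # Four hypothetical experiments with conflicting results:
    effects = [0.40, -0.10, 0.55, 0.05]
    variances = [0.02, 0.03, 0.025, 0.04]

    pooled, var, q = fixed_effect_meta(effects, variances)
    print(f"pooled effect = {pooled:.3f}, SE = {math.sqrt(var):.3f}")
    print(f"Cochran's Q = {q:.2f} on {len(effects) - 1} df")
    ```

    With these numbers, Q exceeds the 5% chi-square critical value on 3 degrees of freedom (about 7.81), so the pooled estimate would be rejected as a single conclusion; this mirrors the instability the abstract reports for the Software Engineering data.
    
    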



    Information

    Published In

    METRICS '99: Proceedings of the 6th International Symposium on Software Metrics
    November 1999
    ISBN: 0769504035

    Publisher

    IEEE Computer Society

    United States


    Qualifiers

    • Article


    Cited By

    • (2018) Statistical errors in software engineering experiments. Proceedings of the 40th International Conference on Software Engineering, pp. 1195-1206. DOI: 10.1145/3180155.3180161. Online publication date: 27-May-2018.
    • (2014) Guidelines for snowballing in systematic literature studies and a replication in software engineering. Proceedings of the 18th International Conference on Evaluation and Assessment in Software Engineering, pp. 1-10. DOI: 10.1145/2601248.2601268. Online publication date: 13-May-2014.
    • (2010) Replications types in experimental disciplines. Proceedings of the 2010 ACM-IEEE International Symposium on Empirical Software Engineering and Measurement, pp. 1-10. DOI: 10.1145/1852786.1852790. Online publication date: 16-Sep-2010.
    • (2006) Empirical software engineering. Proceedings of the 2006 international conference on Empirical software engineering issues: critical assessment and future directions, pp. 135-142. DOI: 10.5555/1767399.1767459. Online publication date: 26-Jun-2006.
    • (2005) Accumulation and presentation of empirical evidence. Proceedings of the 2005 workshop on Realising evidence-based software engineering, pp. 1-3. DOI: 10.1145/1083174.1083178. Online publication date: 17-May-2005.
    • (2005) Accumulation and presentation of empirical evidence. ACM SIGSOFT Software Engineering Notes 30:4, pp. 1-3. DOI: 10.1145/1082983.1083178. Online publication date: 17-May-2005.
    • (2005) Experimental context classification. Proceedings of the 27th international conference on Software engineering, pp. 470-478. DOI: 10.1145/1062455.1062539. Online publication date: 15-May-2005.
    • (2005) An Empirical Exploration of the Distributions of the Chidamber and Kemerer Object-Oriented Metrics Suite. Empirical Software Engineering 10:1, pp. 81-104. DOI: 10.1023/B:EMSE.0000048324.12188.a2. Online publication date: 1-Jan-2005.
    • (2003) A Procedure for Assessing the Influence of Problem Domain on Effort Estimation Consistency. Software Quality Journal 11:4, pp. 283-300. DOI: 10.1023/A:1025861011126. Online publication date: 1-Nov-2003.
    • (2002) Measuring Effort Estimation Uncertainty to Improve Client Confidence. Software Quality Journal 10:2, pp. 135-148. DOI: 10.1023/A:1020523923715. Online publication date: 1-Sep-2002.
