Research article
DOI: 10.1145/3167132.3167293

An experimental evaluation of a de-biasing intervention for professional software developers

Published: 09 April 2018

    Abstract

    Context: The role of expert judgement is essential in our quest to improve software project planning and execution. However, its accuracy depends on many factors, not least the avoidance of judgement biases such as the anchoring bias, which arises from being influenced by initial information even when that information is misleading or irrelevant. This strong effect is widely documented.
    Objective: We aimed to replicate this anchoring bias using professionals and, novel in a software engineering context, explore de-biasing interventions through increasing knowledge and awareness of judgement biases.
    Method: We ran two series of experiments in company settings with a total of 410 software developers. Some developers took part in a workshop to heighten their awareness of a range of cognitive biases, including anchoring. Later, the anchoring bias was induced by presenting low or high productivity values, followed by the participants' estimates of their own project productivity. Our hypothesis was that the workshop would lead to reduced bias, i.e., work as a de-biasing intervention.
    Results: The anchors had a large effect (robust Cohen's d = 1.19) in influencing estimates. This was substantially reduced in those participants who attended the workshop (robust Cohen's d = 0.72). The reduced bias related mainly to the high anchor. The de-biasing intervention also led to a threefold reduction in estimate variance.
    Conclusion: The impact of anchors upon judgement was substantial. Learning about judgement biases does appear capable of mitigating, although not removing, the anchoring bias. The positive effect of de-biasing through learning about biases suggests that it has value.
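    The abstract does not say how the "robust Cohen's d" was computed. One common robust variant, which the sketch below assumes, is the Algina-Keselman-Penfield estimator: the difference of 20% trimmed means scaled by the pooled 20% winsorized standard deviation, with a 0.642 constant that recalibrates it to the conventional Cohen's d under normality. The anchor-group data here are made up for illustration.

    ```python
    # Sketch of a robust Cohen's d (Algina-Keselman-Penfield, 20% trimming).
    # Assumption: this is one plausible reading of "robust Cohen's d"; the
    # paper's exact computation is not given in the abstract.
    import numpy as np
    from scipy import stats

    def winsorized_variance(x, trim=0.2):
        """Sample variance after winsorizing the top/bottom `trim` fraction."""
        xw = stats.mstats.winsorize(np.asarray(x, float), limits=(trim, trim))
        return np.var(xw, ddof=1)

    def robust_cohens_d(x, y, trim=0.2):
        """Trimmed-mean difference over pooled winsorized SD.

        The 0.642 constant rescales the estimate so that, for normal data
        and 20% trimming, it matches the conventional Cohen's d.
        """
        mx, my = stats.trim_mean(x, trim), stats.trim_mean(y, trim)
        nx, ny = len(x), len(y)
        sp2 = ((nx - 1) * winsorized_variance(x, trim) +
               (ny - 1) * winsorized_variance(y, trim)) / (nx + ny - 2)
        return 0.642 * (mx - my) / np.sqrt(sp2)

    rng = np.random.default_rng(0)
    high_anchor = rng.normal(30, 8, 60)  # productivity estimates, high anchor (toy)
    low_anchor = rng.normal(18, 8, 60)   # productivity estimates, low anchor (toy)
    print(round(robust_cohens_d(high_anchor, low_anchor), 2))
    ```

    Trimming and winsorizing make the estimate far less sensitive to the extreme estimates that effort data typically contain, which is presumably why a robust variant was reported rather than the classic d.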




    Published In

    SAC '18: Proceedings of the 33rd Annual ACM Symposium on Applied Computing
    April 2018, 2327 pages
    ISBN: 9781450351911
    DOI: 10.1145/3167132

    Publisher

    Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. cognitive bias
    2. expert judgement
    3. software effort estimation
    4. software engineering experimentation

    Qualifiers

    • Research-article

    Funding Sources

    • EPSRC

    Conference

    SAC 2018: Symposium on Applied Computing
    April 9-13, 2018
    Pau, France

    Acceptance Rates

    Overall Acceptance Rate 1,650 of 6,669 submissions, 25%


    Cited By

    • (2023) Fixations in Agile Software Development Teams. Foundations of Computing and Decision Sciences 48, 1, 3-18. DOI: 10.2478/fcds-2023-0001
    • (2023) Investigating the Relation between Requirements Framing and Confirmation Bias in Testing. In Proceedings of the 27th International Conference on Evaluation and Assessment in Software Engineering, 298-303. DOI: 10.1145/3593434.3593447
    • (2023) The Risk-Taking Software Engineer: A Framed Portrait. In 2023 IEEE/ACM 45th International Conference on Software Engineering: New Ideas and Emerging Results (ICSE-NIER), 25-30. DOI: 10.1109/ICSE-NIER58687.2023.00011
    • (2023) Much more than a prediction: Expert-based software effort estimation as a behavioral act. Empirical Software Engineering 28, 4. DOI: 10.1007/s10664-023-10332-9
    • (2022) SEXTAMT. Journal of Systems and Software 185. DOI: 10.1016/j.jss.2021.111148
    • (2022) Business as Usual Forever? Psychological Mechanisms of Inaction and How Disruptive Communication Might Help. In Disruptive Environmental Communication, 19-42. DOI: 10.1007/978-3-031-17165-9_2
    • (2022) Debiasing Architectural Decision-Making: A Workshop-Based Training Approach. In Software Architecture, 159-166. DOI: 10.1007/978-3-031-16697-6_11
    • (2022) Is Knowledge the Key? An Experiment on Debiasing Architectural Decision-Making - a Pilot Study. In Product-Focused Software Process Improvement, 207-214. DOI: 10.1007/978-3-030-91452-3_14
    • (2021) Trust yourself! Or maybe not: factors related to overconfidence and uncertainty assessments of software effort estimates. In Proceedings of the XXXV Brazilian Symposium on Software Engineering, 452-461. DOI: 10.1145/3474624.3474643
    • (2020) Cognitive Biases in Software Engineering: A Systematic Mapping Study. IEEE Transactions on Software Engineering 46, 12, 1318-1339. DOI: 10.1109/TSE.2018.2877759
