Read - Cannon & Edmondson
Organizations are widely encouraged to learn from their failures, but it is something most
find easier to espouse than to effect. This article synthesizes the authors' wide research
in this field to offer a strategy for achieving the objective. Their framework relates technical
and social barriers to three key activities – identifying failure, analyzing failure and
deliberate experimentation – to develop six recommendations for action. They suggest
that these be implemented as an integrated set of practices by leaders who can 'walk the
talk' and work to shift the managerial mindset in a way that redefines failure away from its
discreditable associations, viewing it instead as a critical first step in a journey of discovery
and learning.
© 2005 Elsevier Ltd. All rights reserved.
Introduction
The idea that people and the organizations in which they work should learn from failure has
considerable popular support – and even seems obvious – yet organizations that systematically
learn from failure are rare. This article provides insight into what makes learning from failure so
difficult to put into practice – that is, we address the question of why organizations fail to learn
from failure.
We also note that very few organizations experiment effectively – an activity that necessarily
generates failures while trying to discover successes – to maximize the opportunity for learning
from failure and minimize its cost. In short, we argue that organizations should not only learn from
failure – they should learn to fail intelligently as a deliberate strategy to promote innovation and
improvement. In this article, we identify the barriers embedded in both technical and social systems
that make such intelligent use of failure rare in organizations, and we offer recommendations for
managers seeking to improve their organization's ability to learn from failure.
Ironically enough, the higher people are in the management hierarchy, the more they tend to
supplement their perfectionism with blanket excuses, with CEOs usually being the worst of all. For
example, in one organization we studied, the CEO spent the entire forty-five-minute interview
explaining all the reasons why others were to blame for the calamity that hit his company.
Regulators, customers, the government, and even other executives within the firm – all were
responsible. No mention was made, however, of personal culpability.13
Identifying failure
Proactive and timely identification of failures is an essential first step in the process of learning from
them. One of the revolutions in manufacturing – the drive to reduce inventory to the lowest
possible levels – was stimulated as much by the desire to make problems and errors quickly visible as
by the desire to avoid other inventory-associated costs. As Hayes and his colleagues have noted,
surfacing errors before they are compounded, incorporated into larger systems, or made
irrevocable, is an essential step in achieving high quality.
Indeed, one of the tragedies in organizational learning is that catastrophic failures are often
preceded by smaller failures that were not identified as being worthy of examination and learning.
In fact, these small failures are often the key 'early warning signs' that can provide the wake-up call
needed to avert disaster further down the road. Social system barriers are often the key driver of this
kind of problem. Rather than acknowledge and address a small failure, individuals have a tendency
to deny the failure, distort the reality of the failure, or cover it up, and groups and organizations
have the tendency to suppress awareness of failures.
pulled a $450 ‘mistake’ out of the company’s dumpster, mounted it on a plaque, and named it the
‘no-nuts award’ – for the missing parts. A presentation ceremony followed at the company barbecue.
‘You can bet no one makes that mistake any more,’ the CEO says. ‘The winner, who was initially
embarrassed, now takes pride in the fact that his mistake has saved this company a lot of money.’17
Analyzing failure
It hardly needs to be said that organizations cannot learn from failures if people do not discuss and
analyze them. Yet this remains an important insight. The learning that is potentially available may
not be realized unless thoughtful analysis and discussion of failure occurs. For example, for Kaiser’s
Dr. Adcock, it is not enough just to know that a particular physician is making more than the
acceptable number of errors. Unless a deeper analysis of the nature of the radiologist's errors is
conducted, it is difficult to learn what needs to be corrected. On a larger scale, the US Army is
known for conducting After Action Reviews that enable participants to analyze, discuss and learn
from both the successes and failures of a variety of military initiatives. Similarly, hospitals use
‘Morbidity and Mortality’ (M&M) conferences (in which physicians convene to discuss significant
mistakes or unexpected deaths) as a forum for identifying, discussing and learning from failures.
This analysis can only be effective if people speak up openly about what they know and if others
listen, enabling a new understanding of what happened to emerge in the assembled group. Many of
these vehicles for analysis only address substantial failures, however, rather than identifying and
learning from smaller ones.
An example of effective analysis of failure is found in the meticulous and painstaking analysis that
goes into understanding the crash of an airliner. Hundreds of hours may go into gathering and
analyzing data to sort out exactly what happened and what can be learned. Compare this kind of
analysis to what takes place in most organizations after a failure.
As noted above, social systems tend to discourage this kind of analysis. First, individuals
experience negative emotions when examining their own failures and this can chip away at self-
confidence and self-esteem. Most people prefer to put past mistakes behind them rather than revisit
and unpack them for greater understanding.
Second, conducting an analysis of a failure requires a spirit of inquiry and openness, patience and
a tolerance for ambiguity. However, most managers admire and are rewarded for decisiveness,
efficiency and action rather than for deep reflection and painstaking analysis.
Third, psychologists have spent decades documenting heuristics and psychological biases and
errors that reduce the accuracy of human perception, sense making, estimation, and attribution.24
These can hinder the human ability to analyze failure effectively.
People tend to be more comfortable attending to evidence that enables them to believe what they
want to believe, denying responsibility for failures, and attributing the problem to others or to ‘the
system’. We would prefer to move on to something more pleasant. Rigorous analysis of failure
requires that people, at least temporarily, put aside these tendencies to explore unpleasant truths
and take personal responsibility. Evidence of this problem is provided by a study of a large
European telecoms company, which revealed that very little learning occurred from a set of large
and small failures over a period of twenty years. Instead of realistic and thorough analysis, managers
tended to offer ready rationalizations for the failures. Specifically, managers attributed large failures
to uncontrollable events outside the organization (e.g., the economy) and to the intervention of
outsiders. Small failures were interpreted as flukes, the natural outcomes of experimentation, or as
illustrations of the folly of not adhering strictly to the company’s core beliefs.25
Similarly, we have observed failed consulting relationships in our field research in which the
consultants simply blamed the failure on the client, concluding that the client was not
really committed to change, or that the client was defensive or difficult. By contrast, a few highly
learning-oriented consultants were able to engage in discussion and analysis that involved raising
questions about how they themselves contributed to the problem.
Deliberate experimentation
The third active process used by organizations to learn from failure is the most provocative. A
handful of exceptional organizations not only seek to identify and analyze failures, they actively
increase their chances of experiencing failure by experimenting. They recognize failure as
a necessary by-product of true experimentation, that is, experiments carried out for the express
purpose of learning and innovating. By devoting some portion of their energy to trying new things,
to find out what might work and what will not, firms certainly run the risk of increasing the
frequency of failure. But they also open up the possibility of generating novel solutions to problems
and new ideas for products, services and innovations. In this way, new ideas are put to the test, but
in a controlled context.
Experiments are understood to have uncertain outcomes and to be designed for learning. Despite
the increased rate of failure that accompanies deliberate experimentation, organizations that
experiment effectively are likely to be more innovative, productive and successful than those that do
not take such risks.32 Similarly, other research has confirmed that those research and development
teams that experimented frequently performed better than other teams.33
Social systems can make deliberate experimentation difficult because most organizations reward
success, not failure. Purposefully setting out to experiment – thus generating and accepting some
failures alongside some successes – although reasonable, is difficult in a general business culture
where failures are stigmatized. Conducting experiments also involves acknowledging that the status
quo is imperfect and could benefit from change. A psychological bias known as the confirmation
bias makes such an acknowledgement harder still.
the key learning from failure, and that intelligent experimental design is a critical tool for
innovation and learning. With this basic understanding, employees are better able to recognize
when they either need to receive more specialized training themselves or to engage the assistance of
someone else who has benefited from such training.
Reframing failure
The recommendations above are best implemented as an integrated set of practices accompanied
by an encompassing shift in managerial mindset. Table 2 summarizes this shift.
First, failure must be viewed not as a problematic aberration that should never occur, but rather
as an inevitable aspect of operating in a complex and changing world. This is of course not to say
leaders should encourage people to make mistakes, but rather to acknowledge that failures are
inevitable, and hence the best thing to do is to learn as much as possible – especially from small
ones – so as to make larger ones less likely. Beliefs about effective performance should reflect this.
This implies holding people accountable, not for avoiding failure, but for failing intelligently, and
for how much they learn from their failures.
Of course, whether a failure turns out to be intelligent or not is sometimes not easy to know at
the outset of an experiment. To provide managers with some guidelines, organizational scholar Sim
Sitkin identifies five characteristics of intelligent failures: (1) They result from thoughtfully planned
actions, (2) have uncertain outcomes, (3) are of modest scale, (4) are executed and responded to
with alacrity, and (5) take place in domains that are familiar enough to permit effective learning.45
Managers would also be smart to consider their organization’s current issues related to risk
management as they develop experiments. By considering these criteria in advance, and by
analyzing and learning from previous experiments, managers are able to increase the chances that
their failures will be intelligent.
Examples of unintelligent failure include making the same mistake over and over again, failing
due to carelessness, or conducting a poorly designed experiment that will not produce helpful
learning. In addition, managers need to create an environment in which they and their employees
Table 2
Expectations about failure: 'Failure is not acceptable' becomes 'Failure is a natural byproduct of
a healthy process of experimentation and learning'.
Beliefs about effective performance: 'Involves avoiding failure' becomes 'Involves learning from
intelligent failure and communicating the lessons broadly in the organization'.
Conclusion
This article starts from the observation that few organizations make effective use of failures for
learning due to formidable and deep-rooted barriers. In particular, small failures can be valuable
sources of learning, presenting ‘early warning signs’. However, they are often ignored, and thus
their valuable lessons for preventing serious harm are missed. We show that properties of technical
systems combine with properties of social systems in most organizations to make failures’ lessons
especially difficult to glean. At the same time, we highlight noteworthy exceptions – organizations
that have done a superb job of making failures visible, analyzing them systematically, or even
deliberately encouraging intelligent failures as part of thoughtful experimentation.
Organizational learning from failure is thus not impossible but rather is counter-normative and
often counter-intuitive. We suggest that making this process more common requires breaking it
down into essential activities – identifying failure, analyzing failure, and experimenting – in which
individuals and groups can engage. By reviewing examples from a variety of organizations and
industries where failures are being mined and put to good use through these activities, we seek to
demystify the potentially abstract ideal of learning from failure. We offer six actionable
recommendations, and argue that these recommendations are best implemented by reframing
managerial thinking, rather than by treating them as a checklist of separate actions.
In conclusion, leaders can draw on this conceptual foundation as they seize opportunities, craft
skills, and build routines, structures, and incentives to help their organizations enact these learning
processes. At the same time, we do not underestimate the challenge of tackling the psychological
and interpersonal barriers to this organizational learning process. As human beings, we are
socialized to distance ourselves from failures. Reframing failure from something associated with
shame and weakness to something associated with risk, uncertainty and improvement is a critical
first step on the learning journey.
Acknowledgements
We acknowledge the financial support of the Division of Research at Harvard Business School for
the research that gave rise to these ideas. We would also like to thank the LRP editor, as well as the
Special Issue editors and the anonymous reviewers for very helpful feedback.
References
1. M. Cannon and A. C. Edmondson, Confronting failure: Antecedents and consequences of shared beliefs
about failure in organizational work groups, Journal of Organizational Behavior 22, 161–177 (2001).
2. D. Vaughan, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, University
of Chicago Press, Chicago, IL (1996).
Biographies
Mark Cannon is an Assistant Professor of Leadership, Policy and Organizations and of Human and Organizational
Development at Vanderbilt University. He investigates barriers to learning in organizational settings, such as
positive illusions, individual and organizational defenses, and barriers to learning from failure. He has published
recently on executive coaching topics, including coaching leaders in transition and coaching interventions that
produce actionable feedback. His work has appeared in the Academy of Management Executive, Human Resource
Management, and Journal of Organizational Behavior. He received his Ph.D. in Organizational Behavior from
Harvard University. Vanderbilt University, Peabody #514, Nashville, TN 37203. Tel: (615) 343-2775 Fax: (615) 343-7094 e-mail: mark.d.cannon@vanderbilt.edu
Amy C. Edmondson, Professor of Business Administration and Chair of the Doctoral Programs Committee at
Harvard Business School, studies teams in healthcare and other industries, and emphasizes the role of psychological
safety for enabling learning, change, and innovation in organizations. In 2003, she received the Cummings Award
from the Academy of Management OB division for outstanding achievement in early-mid career. Her recent article,
Why Hospitals Don’t Learn from Failures: Organizational and Psychological Dynamics That Inhibit System Change
(with A. Tucker), received the 2004 Accenture Award for a significant contribution to management practice.
Edmondson received her PhD in organizational behavior from Harvard University. Morgan Hall T-93, Harvard
Business School, Boston, MA 02163 Tel: (617) 495-6732 Fax: (617) 496-5265 email: aedmondson@hbs.edu