RMAS An Officer and A Problem Solver
No 6
© 2011. No part of this publication, except for short extracts, may be reproduced, stored in a
retrieval system, or transmitted in any form without the prior permission of the Royal Military
Academy Sandhurst.
ISBN: 978-1-907628-05-4
The views expressed in this paper are solely those of the author(s) and do not necessarily reflect
official thinking or policy of Her Majesty’s Government, the Ministry of Defence or the Royal
Military Academy Sandhurst.
Correspondence and enquiries about the Sandhurst Occasional Paper series should be addressed to:
The Senior Librarian, Central Library, Royal Military Academy Sandhurst Camberley, Surrey
GU15 4PQ
e-mail: senlibrarian@rmas.mod.uk
An Officer and a Problem Solver
This assessment (forged in the experience of Iraq and Afghanistan) has driven
an agenda of transformation across the Army. The aim is to make the changes that
are necessary to enable the Army to be the best of its size in the world. This is set out
in the vision of Commander Force Development and Training (FDT), the 3 star
Command that was established in 2010 to oversee the entire process of how officers
and soldiers are recruited, trained, managed and prepared for Operations:
1. Future Character of Conflict, Defence Concepts and Doctrine Centre, 2010.
“[An Army] founded on first rate thought, able to out-think as well as
outfight its enemies, quick to absorb lessons and to adjust practice,
procedures, doctrine, organisation and equipment faster than our
adversaries … One with leaders who are mentally agile, comfortable
with uncertainty, complexity, chaos, and change, trained to seize the
initiative and to exploit opportunities at the lowest level, culturally
and technologically aware, with the judgment and understanding to
work across the full breadth of the operating environment … One
prepared for war, today and in the future.”2
The terms we have emphasised in italics make clear just how important
problem solving and decision making skills are at every level of leadership. Simply
put, decision making is the keystone skill of leadership; in a practical sense, Officers
are commissioned to face problems and to solve them in the best interests of their
soldiers and their nation. Some of these decisions will be routine and the solution
apparently obvious; some will be hugely complex and seemingly impossible to
resolve. But all will require a decision, the choice of a course of action –
remembering, of course, that choosing to take ‘no action’ is a decision too.
2. FDT Directive 2009/10.
What makes for better decision making?
• The best problem solvers are aware of the processes of problem solving. Their
understanding of both the intuitive and effortful processes enables them to iron
out the creases in their thinking and to choose the best strategy for the problem
at hand; they know when to work intuitively and when to stop and grind through
the issue.
• Effective problem solvers review and reflect upon their decisions. Their
analysis, especially on what went well and why, helps inform and develop their
approach.
• They work on ‘educating’ their intuition; they don’t simply allow their
experience to be translated into ‘intuitive’ competence: they actively engage in
the process.
• They employ problem solving tools and techniques such as the story model, role
playing, etc.
• They work to develop their emotional intelligence, by which is meant their
ability to understand their emotional responses to situations – responses that are
often key to understanding why we have chosen a particular course of action.
This is a key element of handling the stress that can accompany problem solving.
• They are students of ‘human psychology’ in the broadest sense. Given that all
decisions we make involve people in some way it makes sense to understand
them as best as possible and not just to rely on our intuitions – remember that
one person’s ‘common sense’ can be another’s prejudice.
• The most creative and innovative thinkers relish ambiguity, the ‘shades of
grey’ in a problem.
• They have worked to develop their power of statistical reasoning. They will
often have to work with technical information that is incomplete or ambiguous;
for example, you are told a new weapon system is 10% better than the old
weapon system or that there is a 25% chance of rain tomorrow. Without a
sound grasp of statistical thinking with which to interrogate these claims, your
judgment might be one of faith rather than reason.
• They ensure that they exercise, sleep and eat well; a fit, well-nourished and
well-rested brain is considerably more effective than a tired one.
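The weapon-system claim in the statistical reasoning point above can be interrogated with simple arithmetic: '10% better' is ambiguous until you ask whether it is a relative or an absolute improvement, and better than what baseline. A minimal sketch in Python (the hit rates are invented purely for illustration):

```python
# Hypothetical baseline: the old system hits 5% of shots at a given range.
old_hit_rate = 0.05

# Reading 1: "10% better" as a RELATIVE improvement on the baseline.
relative_reading = old_hit_rate * 1.10      # 5.5% - barely any difference

# Reading 2: "10% better" as an ABSOLUTE improvement (10 percentage points).
absolute_reading = old_hit_rate + 0.10      # 15% - three times as effective

print(f"relative reading: {relative_reading:.3f}")   # 0.055
print(f"absolute reading: {absolute_reading:.3f}")   # 0.150
```

The same question applies to the '25% chance of rain': does it refer to any rain anywhere in the forecast area, or to rain at your location? Without asking, the judgment is one of faith rather than reason.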
Thinking about Thinking
We can begin then by identifying the two basic ways of solving a problem: the
intuitive and the effortful. Fig 1 sets out the characteristics of both, their
shortcomings and where they might be best used and by whom.
Notice that intuition comes in two forms: naïve and educated. The intuitive
problem solving of a Platoon Commander with years of experience is of a different
order from untutored intuition. The challenge is to educate intuition. This can be done
in two ways. First, to use every relevant experience – say, in tactics – to unpick why a
certain decision has been made or why the problem solver felt drawn to a particular
course of action. Secondly, we can make use of the expert intuition in our midst: the
wealth of operational and other relevant experience that has been accumulated in the
Army over the past decade. We encourage Officer Cadets to use any opportunities
they get to ask their Instructors to unpick their decisions for them. However,
this is no easy task; many experienced practitioners have become so ‘intuitive’ that
they find it difficult to lay bare their thinking in this way. There are organisational
culture issues too. The Army is culturally a ‘doing’ organisation which has
traditionally privileged action over reflection. A mentally agile Army will value
‘thinking about thinking’ and appreciate its role in developing its ability to make
better decisions both individually and collectively.
For example, ‘thinking about thinking’ makes us aware of the distinction between
different kinds of thinking: between close, analytical thinking, where we use logic
and facts, and ‘reflective’ thinking, where we use our experience of the world to
inform our decision. Good decisions usually rely on judging how the two sorts of
thinking inform the final decision. For example, an Infantry Officer in Afghanistan
may assess the ground and identify three feasible routes to an objective. While his
close, analytical thinking tells him that Route Bravo offers the best protection, access
etc., his reflective thinking tells him that the insurgents have seen another platoon in
the Coy use Route Bravo on an earlier occasion.
So how does RMAS prepare Officer Cadets as problem solvers and decision
makers?
Officer Cadets, of course, will be solving problems and making decisions of all sorts
during their time at the Academy; as they organise themselves and their kit, they’ll
make thousands of decisions about how to prioritise their time and effort. There will
be plenty of practice, too, during lessons – both in the field and in the Hall of Studies
– where they’ll be set problems and questions they’ll have to resolve. But, crucially,
they’ll also examine the processes of problem solving. As well as carrying out
Command Tasks, they’ll be asked to reflect on how they arrived at a decision. And,
of course, they will be introduced to the ‘7 Questions’, the problem solving tool that
the Army uses to inform the Estimate Process. Their proficiency in using the 7
Questions will develop in the Intermediate Term and by the time their competence in
basic tactics is assessed, they should have a reasonable grasp of it. Exercise
Normandy Scholar, where they study situations that Allied troops faced in Normandy
at the time of D-Day Landings, will ask them to use the 7 Questions to decide what
they might have done in that situation and to compare their solution with what really
happened.
The lessons in CABS aim to foreground the processes and practices of problem
solving and, in doing so, complement their practical lessons in the field. The CABS
lessons are set out in a logical, ordered manner; not because that’s how we always
solve problems, but simply to give the lessons a structure and shape.
The lessons are organised around a problem solving model (below) that sets
out the various elements that make up the problem solving process.
[Diagram: the problem solving model. A Problem passes through four spaces – the
Comprehension Space (shaped by psycho-social factors), the Problem Space (worked
through with the 7 Questions), the Option Generation Space (where creative thinking
produces options) and the Decision Making Space – to arrive at a Solution.]
Although these often do not happen in this order, for the sake of simplicity we have
identified them as follows:
The Comprehension Space
This is where we make sense of the problem or problems before us. Crucial
here is how the problem is presented or ‘framed’. For example, we know that the
wording of a problem can have an effect on how we might go about solving it. A plan
that will ‘save over 60%’ is often viewed as preferable to one that will ‘lose over a
third’ even though the result is the same. As well as the wording there are other
factors at play – the time available, the seriousness of the situation, the expectations
of those around the decision maker, and so on. The lesson is that we may need to stop
and think about how we comprehend the problem before we leap to a possible
solution. For example, does it matter that our intuitive solution is likely to be
predictable?
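The ‘save over 60%’ / ‘lose over a third’ example is worth checking arithmetically: the two framings can describe exactly the same outcome. A small sketch (the figure of 100 people affected is invented for illustration):

```python
# Hypothetical plan affecting 100 people, of whom 62 are saved.
total = 100
saved = 62
lost = total - saved                       # 38

frame_as_saving = saved / total > 0.60     # "saves over 60%"     -> True
frame_as_losing = lost / total > 1 / 3     # "loses over a third" -> True

# Both descriptions are true of the same plan; experiments show the
# 'save' framing is typically judged more favourably all the same.
print(frame_as_saving, frame_as_losing)    # True True
```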
The Problem Space
This is the phase where we interrogate the evidence and navigate our way
through the problem. This is best accomplished using a problem solving tool. The 7
Questions is an excellent aid in this regard: it is designed to help the problem solver
ensure that the issue has been explored rigorously. In the CABS lessons we introduce
a critical thinking tool (RAVEN-C) which is useful in filtering through and assessing
ambiguous or incomplete evidence.
Option Generation
At this final stage the task is to weigh up and assess the options that have been
generated. It is wise here to use a formal decision making tool that will a) allow us
to identify the criteria against which the options are to be assessed and b) provide a
means of auditing our decision (ie, we will be able at a later date to explain and
justify our choice of option).
THE COMPREHENSION SPACE
Problem Types
Effective problem solving often hinges on recognising the type of problem that
is being faced. In general terms, there are four main problem types: simplistic,
deterministic, random and indeterminate (or ‘Wicked’):
1. Simplistic problems: where there is one and only one answer. For example,
who is the Chief of the Defence Staff?
2. Deterministic problems: where there is one correct answer, and it can be reached
by applying the appropriate rule or procedure.
3. Random problems: where there will eventually be only one answer, but there are a
number of possible correct answers. For example, who will win this intake’s Sword of
Honour?
4. Indeterminate (or ‘Wicked’) problems: where there is no agreed answer and often
no agreement on what the problem actually is; even the ‘facts’ may be contested by
the various stakeholders.
[Diagram: across the four problem types, from simplistic to indeterminate, the role
of facts and algorithms decreases while the role of judgment increases.]
When faced with a problem it is useful to categorise and ask: ‘what kind of
problem is this?’ Once identified, we can apply the most appropriate problem solving
process. As the diagram shows, in both simplistic and deterministic problems, facts
and algorithms are applied; judgment is not. For example, there is no discussion
about how to carry out a VCP; you simply use the appropriate SOP. However,
random problems entail both facts and judgment. To return to the question of the
Sword of Honour winner, although it is possible that any of the senior term cadets
will win, pertinent evidence from performance, reports etc. enables us to narrow
the field down considerably. In dealing with indeterminate problems, ‘facts’ are
likely to be rejected or contested by the various stakeholders making their use of little
value or, worse, counter-productive, particularly when those ‘facts’ are culturally
derived.
Of course, in the complex real world we inhabit, the ‘problem’ we are faced
with will, in all likelihood, be a combination of several problems and a helpful first
step can be to unravel the separate problems and identify the kind of problem they
are. This can be useful as we begin to solve them; for example, simplistic and
deterministic problems can easily be delegated, leaving you with more time to
dedicate to indeterminate problems.
Psycho-social factors: heuristics and biases
The concept of cognitive biases was introduced by Amos Tversky and Daniel
Kahneman in 1972 and grew out of their experience of people's innumeracy, or
inability to reason intuitively with the greater orders of magnitude. They and their
colleagues demonstrated several replicable ways in which human judgments and
decisions differ from a rational logical model. They explained these differences in
terms of heuristics: rules which are simple for the brain to compute but which
introduce systematic errors. The ‘availability heuristic’, for instance, uses the
ease with which something comes to mind as an indicator of how often (or how
recently) it has been encountered.
Some biases reflect a subject's motivation; for example, the desire for a
positive self-image leading to ‘egocentric bias’ and the avoidance of unpleasant
cognitive dissonance. Other biases are due to the particular way the brain perceives,
forms memories and makes judgments. These are some of the most common
heuristics and biases:
• The fundamental attribution error – our tendency to over-emphasise
personality-based explanations for other people’s behaviour and to under-estimate
situational influences – can also operate in the reverse direction: we
over-emphasise the role and power of situational influences on ourselves and
under-estimate personality-based explanations.
These heuristics and biases have affected and will continue to affect military decision
makers3 and awareness of how they influence us is essential to anyone who would
seek to improve their decision making.
Reputation
3. Williams, BS (2010) ‘Heuristics and Biases in Military Decision Making’, Military Review, Sep/Oct 2010, Vol 90 Issue 5, pp 40-52.
Ability to observe
• Did the source really see the event? Sources often imply to others or
even believe themselves that they actually saw an event, but when
questioned in depth it transpires that they did not.
Vested Interests
• What are the interests of the source in this situation? These interests
can be difficult to determine and it is easy to ignore them or to jump to
a cynical, and perhaps false, conclusion about them. Nevertheless,
whilst keeping an open mind, this question does require an answer.
• Vested interest to tell the truth – having a motive to tell the truth;
for example, because lying might endanger the source’s professional
reputation or religious or moral scruples.
Expertise and experience
Neutrality
Corroboration
Wicked Problems
Chief’s wish to see the Army ‘training as we need to fight in Afghanistan’4. In that
theatre political, moral and cultural ambiguity often prevails, competing interests
characterise issues, and outcomes are unpredictable – perceptions and worldviews
collide with unexpected results. ‘Wicked problems’ was a label Rittel and Webber
gave to problems where there ‘… is no consensus on what the problems are, let alone
how to resolve them.’5 They claimed that such problems cannot be resolved with
traditional analytical approaches. It is clear that where such problems predominate,
the success of both kinetic and non-kinetic efforts relies on understanding their nature
and finding alternative ways of solving them.
1. There is no definitive formulation of a wicked problem. Different stakeholders
will frame the problem in different, and often incompatible, ways.
2. Wicked problems have no stopping rule. Since you cannot define the problem,
it is difficult to tell when it is resolved. The problem solving process ends when
resources are depleted, stakeholders lose interest or political realities change.
6. Wicked problems do not have a well-described set of potential solutions.
Various stakeholders will have differing views of acceptable solutions. It is a
matter of judgment as to when enough potential solutions have emerged and
which should be pursued.
10. Those who seek to solve these problems have no right to be wrong. A scientist
is expected to formulate a hypothesis, which may or may not be supportable by
evidence. Military officers do not have such a luxury: they are expected to get
things right.
Tackling wicked problems
To quote Rittel & Webber: “The classical systems approach … is based on the
assumption that a … project can be organized into distinct phases: ‘understand the
problems’, ‘gather information,’ ‘synthesize information’, ‘work out solutions’ and
the like. For wicked problems, however, this type of scheme does not work. One
cannot understand the problem without knowing about its context; one cannot
meaningfully search for information without the orientation of a solution concept;
one cannot first understand, then solve.”
a. Dynamic complexity. Dynamic complexity is low when cause and effect are
nearby (physically or temporally); e.g., ‘the car won’t start’. Dynamic complexity
is high when cause and effect are far apart; e.g., the effects of how we treat the
environment today on future generations.
b. Generative complexity. Generative complexity is low when the future is familiar
and predictable; it is high when the future is unfamiliar and undetermined.
c. Social complexity. Social complexity is low when all the people have the
same assumptions, values, rationales etc.; e.g., a group of Officer Cadets facing a
command task. Social complexity is high when the people involved have different
assumptions, values, worldview, etc.; e.g., a military operation in an unfamiliar
and alien society.
OPTION GENERATION
Creative thinking
1. The situations you will encounter (both professionally and personally) will be
unique to your experience. ‘Off the shelf’, ‘template’ solutions will often not
produce the most appropriate or most effective answers. You will need to devise
solutions and courses of action to suit the circumstances: in other words, to
generate novel solutions to novel situations. And that requires creative thinking.
• Random input
• Problem reversal
• Lateral thinking
• Forced relationships/analogy
• Metaphorical thinking
• Unconscious problem solving
• Fuzzy thinking
However, if any of these techniques are to work, we have to understand and
overcome some of the barriers to creativity and innovation.
Risk Aversion
Group Pressure
Personal Disposition
It is important to see creative thinking as something we all do, all the time, as
we react to the unique circumstances of our lives; it is not just the preserve of
‘creative people’. However, as with any skill, some people are more practised than
others, so we need to work on it if we want to be the best we can be.
Creative thinking skills go hand in hand with other thinking skills such as ‘rational
thinking’ skills (the sort of close critical analysis we engage in when we interrogate
evidence) and ‘reflective thinking’ skills (when we take our experience to a problem
or situation and use that to predict what might happen if we make a particular
decision or choose a particular course of action). Developing those skills should be
part of ongoing personal and professional development.
A formal decision making tool will:
a. set out the process and thus create an auditable trail which can allow the
decision maker to explain their decision at a later date;
b. codify and quantify the inescapable subjectivity that will play a part in any
deliberations;
c. bind a group of individuals into a single process to which they all contribute;
and, perhaps most importantly,
d. offer a system in which each element is paired with another and each pair is
assessed using a single criterion at a time – the human mind can only make a
series of single decisions. So, for example, if you were asked to choose between
taking Soldier A, Soldier B or Soldier C on a mission you would be better advised
to first identify the key criteria – say marksmanship and endurance – and then
consider each pair (A and B, A and C, B and C) separately against each single
criterion, rather than try to assess all three on both dimensions simultaneously.
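The Soldier A/B/C example in d. can be sketched in code. Each pair is compared on a single criterion at a time, and one simple way to aggregate the results (our own addition here, not from the original text) is to count single-criterion wins; the names and scores below are entirely hypothetical:

```python
from itertools import combinations

# Hypothetical scores (out of 10) against the two example criteria.
scores = {
    "Soldier A": {"marksmanship": 7, "endurance": 8},
    "Soldier B": {"marksmanship": 6, "endurance": 5},
    "Soldier C": {"marksmanship": 8, "endurance": 4},
}
criteria = ["marksmanship", "endurance"]

wins = {name: 0 for name in scores}
audit = []  # a record of every comparison -> the 'auditable trail' in a.

for left, right in combinations(scores, 2):   # (A,B), (A,C), (B,C)
    for criterion in criteria:                # one criterion at a time
        if scores[left][criterion] > scores[right][criterion]:
            winner = left
        elif scores[right][criterion] > scores[left][criterion]:
            winner = right
        else:
            continue                          # a tie scores no win
        wins[winner] += 1
        audit.append((left, right, criterion, winner))

choice = max(wins, key=wins.get)
print(wins, "->", choice)                     # Soldier A wins most comparisons
```

Each entry in `audit` records a single pairwise, single-criterion judgment, which is what allows the choice to be explained and justified at a later date.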
There is a final issue to consider here and in many ways it is the most
important: the ethical dimension of decision making. This is complex territory that
raises as many questions as it answers. How might ethics and values be incorporated
into a model of problem solving and decision making? Is there a set of moral
principles that we take to every situation or do we derive the most ethical solution
from weighing up the rights and wrongs of specific issues at hand? Philosophers
often set ethical dilemmas to test these questions. Consider these famous examples:
a. You are travelling in a remote part of the world alone and unarmed. You
stumble into a small village to find a firing squad about to execute ten children
aged between 7 and 8 years. You express your horror and the commander of the
firing squad offers to spare 9 children as long as you choose the one that should be
executed. Do you take up the offer? And if you do, how do you choose?
b. A runaway train is travelling down a disused rail line. It will run into and kill
10 people who are picnicking on the line. You can prevent this by changing the
course of the train to a side line. However, there are two workers on the side line
who will be killed if the train is diverted. What do you do?
c. A man buys a chicken at a supermarket. He takes it home and has sex with it.
He then cooks and eats the chicken. Explain why this is morally wrong.
That final example is drawn from the work of Jonathan Haidt, Professor of Social
Psychology at the University of Virginia, who runs www.yourmorals.org, a website
dedicated to exploring the relationship between morals and emotions.
The lessons in CABS and elsewhere in the Academy will, we hope, help to
develop the Officer Cadets’ problem solving and decision making, but there is much
that they will need to resolve personally and wider issues that the Army needs to
address collectively:
• How do we prepare men and women to cope with environmental factors – such
as stress or the need for quick decisions – and how will these affect their
problem solving?
• How do we incorporate ethics and values into problem solving and decision
making?
• What effect do personal and organisational attitudes towards risk have on the
decisions that individuals make?
Further reading
Baron, J (1985) Rationality and Intelligence New York: Cambridge University Press
Baron, J (2008) Thinking and Deciding New York: Cambridge University Press
Dawes, R (2001) Everyday Irrationality Colorado: Westview Press
Gilovich, T, Griffin, D and Kahneman, D (eds) (2002) Heuristics and Biases: The Psychology of
Intuitive Judgment New York: Cambridge University Press
Haidt, J (2001). The emotional dog and its rational tail: A social intuitionist approach to moral
judgment. Psychological Review, 108, 814-834
Kahane, A (2007) Solving tough problems: an open way of talking, listening, and creating new
realities California: Berrett-Koehler
Kahneman, D and Tversky, A (1972) ‘Subjective probability: A judgment of representativeness’,
Cognitive Psychology 3: 430–454
Klein, G (2003) The Power of Intuition New York: Doubleday
Morgan, J (1998) The Thinker’s Toolkit New York: Three Rivers Press
Plous, S (1993) The Psychology of Judgment and Decision Making New York: McGraw-Hill
Rittel, H and Webber, M (1973) ‘Dilemmas in a General Theory of Planning’ Policy Sciences Vol
4 pp 155-169
Simon, HA (1956) ‘Rational choice and the structure of the environment’ Psychological Review
Vol 63 No 2, 129-138
Williams, BS (2010) ‘Heuristics and Biases in Military Decision Making’ Military Review Sep/Oct
2010, Vol 90 Issue 5, p 40-52