

Sandhurst Occasional Papers

No 6

‘An Officer and a Problem Solver’:


Developing Problem Solving and Thinking
Skills in Officer Cadets at Sandhurst

Department of Communication and Applied


Behavioural Science, RMAS

Royal Military Academy Sandhurst


2011
The Department of Communication and Applied Behavioural Science, RMAS

Growing out of an existing department of management studies, Communication and Applied Behavioural Science (CABS) was established in 2008. Its aim is to develop the Officer Cadets’ competence in four key areas of leadership: Motivation and Teambuilding; Communication and Influence; Problem Solving and Decision Making; and the Leading and Management of Change.

The Department’s approach is a ‘human factors’ one: it uses the techniques, knowledge and insights that current (and classic) research in the Behavioural Sciences offers – from ways of understanding ourselves to the science of influencing others – to inform what might be termed ‘the practical psychology of leadership’. The lessons are mostly experiential: the Officer Cadets practise their media skills in interviews and conferences; they practise their negotiation skills in a variety of exercises; they not only learn about motivation but go out, survey others and learn about their perceptions. This practical approach is balanced by an ethos of ‘learning about learning’ and ‘thinking about thinking’ which underpins the entire CABS syllabus.

SANDHURST OCCASIONAL PAPER NO 6


Series Editor: Sean McKnight (Director of Studies, RMAS)

© 2011. No part of this publication, except for short extracts, may be reproduced, stored in a
retrieval system, or transmitted in any form without the prior permission of the Royal Military
Academy Sandhurst.

ISBN: 978-1-907628-05-4

The views expressed in this paper are solely those of the author(s) and do not necessarily reflect
official thinking or policy of Her Majesty’s Government, the Ministry of Defence or the Royal
Military Academy Sandhurst.
Correspondence and enquiries about the Sandhurst Occasional Paper series should be addressed to:

The Senior Librarian, Central Library, Royal Military Academy Sandhurst Camberley, Surrey
GU15 4PQ

e-mail: senlibrarian@rmas.mod.uk

An Officer and a Problem Solver

Sandhurst is preparing Officer Cadets to serve in a complex, changing world in which the nature of warfare, and with it the kind of adversaries they might face, may be radically different from that which went before:

“Conflict follows a natural cycle of adaptation and response, but its evolution is neither linear, nor constant … [W]hile we have adapted
well to some of the demands of current operations there is a growing
sense that aspects of Defence are out of phase and lagging; we are still
optimised for the conflicts that we fought in the past. Future conflict
will be increasingly hybrid in character. This is not code for
insurgency or stabilisation; it is about a change in the mindset of our
adversaries, who are aiming to exploit our weaknesses using a wide
variety of high-end and low-end asymmetric techniques. These forms
of conflict are transcending our conventional understanding of what
equates to irregular and regular military activity; the ‘conflict
paradigm’ has shifted and we must adapt our approaches if we are to
succeed.”1

This assessment (forged in the experience of Iraq and Afghanistan) has driven
an agenda of transformation across the Army. The aim is to make the changes that
are necessary to enable the Army to be the best of its size in the world. This is set out
in the vision of Commander Force Development and Training (FDT), the 3 star
Command that was established in 2010 to oversee the entire process of how officers
and soldiers are recruited, trained, managed and prepared for Operations:

1
Future Character of Conflict, Defence Concepts and Doctrine Centre, 2010.

“[An Army] founded on first rate thought, able to out-think as well as
outfight its enemies, quick to absorb lessons and to adjust practice,
procedures, doctrine, organisation and equipment faster than our
adversaries … One with leaders who are mentally agile, comfortable
with uncertainty, complexity, chaos, and change, trained to seize the
initiative and to exploit opportunities at the lowest level, culturally
and technologically aware, with the judgment and understanding to
work across the full breadth of the operating environment … One
prepared for war, today and in the future.”2

The terms we have emphasised in italics make clear just how important
problem solving and decision making skills are at every level of leadership. Simply
put, decision making is the keystone skill of leadership; in a practical sense, Officers
are commissioned to face and solve problems that will be in the best interests of their
soldiers and their nation. Some of these decisions will be routine and the solution
apparently obvious; some will be hugely complex and seemingly impossible to
resolve. But all will require a decision, the choice of a course of action –
remembering, of course, that choosing to take ‘no action’ is a decision too.

This paper is derived from work in the Department of Communication and Applied Behavioural Science (CABS) at RMAS. We have been developing an approach that integrates the hands-on practice of problem solving with the reflection that is the key to developing these critical skills. We hope that, in a small way, our work in this area can support the effort to create an Army ‘able to out-think as well as outfight its enemies’.

2
FDT Directive 2009/10

What makes for better decision making?

At RMAS we appreciate that our cadets come to us as accomplished problem solvers, evidenced by the personal, educational and professional successes they have
enjoyed. The challenge for us is to help them become even better. In this complex
area, what does ‘good’ look like? What does current research tell us about the best
decision makers? It tells us the following:

• The best problem solvers are aware of the processes of problem solving. Their
understanding of both the intuitive and effortful processes enables the best
problem solvers to iron out the creases in aspects of their thinking and to
choose the best strategy for the problem at hand; they know when to work
intuitively and when to stop and grind through the issue.

• Effective problem solvers review and reflect upon their decisions. Their
analysis, especially on what went well and why, helps inform and develop their
approach.

• They work on ‘educating’ their intuition; they don’t simply allow their
experience to be translated into ‘intuitive’ competence: they actively engage in
the process.

• They employ problem solving tools and techniques such as the story model, role playing, etc.

• They work to develop their emotional intelligence, by which is meant their ability to understand their emotional responses to situations, which are often key to understanding why a particular course of action has been chosen. This is a key element of handling the stress that can accompany problem solving.

• They are students of ‘human psychology’ in the broadest sense. Given that all decisions we make involve people in some way, it makes sense to understand them as well as possible and not just to rely on our intuitions – remember that one person’s ‘common sense’ can be another’s prejudice.

• The most creative and innovative thinkers relish ambiguity, the ‘shades of
grey’ in a problem.

• They have worked to develop their power of statistical reasoning. They will
often have to work with technical information that is incomplete or ambiguous;
for example, you are told a new weapon system is 10% better than the old
weapon system or that there is a 25% chance of rain tomorrow. Without a
sound grasp of statistical thinking with which to interrogate these claims, your
judgment might be one of faith rather than reason.

• They ensure that they exercise, sleep and eat well; a fit, well-nourished and
well-rested brain is considerably more effective than a tired one.
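The bullet on statistical reasoning can be made concrete: a claim that a weapon system is ‘10% better’ means little until you know whether the improvement is relative or absolute. A minimal sketch, with invented hit rates purely for illustration:

```python
# Hypothetical figures: an old weapon system hits 20% of the time,
# and a new one is reported as "10% better". Whether that figure is
# relative or absolute changes the picture entirely.
old_hit_rate = 0.20

relative = old_hit_rate * 1.10   # "10% better" relatively  -> 22%
absolute = old_hit_rate + 0.10   # "10 points better"        -> 30%

assert round(relative, 2) == 0.22
assert round(absolute, 2) == 0.30
```

Interrogating which reading the claimant intends is exactly the kind of statistical questioning the bullet recommends.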

Thinking about Thinking
We can begin then by identifying the two basic ways of solving a problem: the
intuitive and the effortful. Fig 1 sets out the characteristics of both, their
shortcomings and where they might be best used and by whom.

Intuitive (or naturalistic):

(a) Naive intuition
    Characteristics: informed by ‘common sense’, intuition and experience in other contexts.
    Possible shortcomings: especially sensitive to psycho-social factors (cognitive biases, group pressure, etc).
    Good for: expert decision makers; where time is limited.

(b) ‘Educated intuition’: e.g., recognition-primed decision making (cf Klein, 2003)
    Characteristics: informed by specific expert strategies – mental simulation, prototypical models, expectancies, cues, the singular evaluation approach.
    Possible shortcomings: ‘satisficing’, ie settling for a good enough solution rather than striving for the best solution (Simon, 1956).
    Good for: situations that don’t merit a time-consuming effort but which must be reflected on and learnt from.

Effortful (or rational) (cf Baron, 1985)
    Characteristics: informed by rational, logical thinking making use of formal problem solving tools: eg, the 7 Questions.
    Possible shortcomings: unnecessary effort; slow.
    Good for: novice decision makers; all decision makers where they have the time and the decision is deemed important enough to warrant this approach.

Fig 1 Intuitive and effortful thinking

Notice that intuition comes in two forms: naïve and educated. The intuitive
problem solving of a Platoon Commander with years of experience is of a different
order from untutored intuition. The challenge is to educate intuition. This can be done
in two ways. First, to use every relevant experience – say, in tactics – to unpick why a
certain decision has been made or why the problem solver felt drawn to a particular
course of action. Secondly, we can make use of the expert intuition in our midst: the
wealth of operational and other relevant experience that has been accumulated in the
Army over the past decade. We encourage Officer Cadets to use any opportunity they get to ask their Instructors to unpick their decisions for them. However, this is no easy task; many experienced practitioners have become so ‘intuitive’ that they find it difficult to lay bare their thinking in this way. There are organisational culture issues too. The Army is culturally a ‘doing’ organisation which has traditionally privileged action over reflection. A mentally agile Army will value
‘thinking about thinking’ and appreciate its role in developing its ability to make
better decisions both individually and collectively.

For example, ‘thinking about thinking’ makes us aware of the distinction between
different kinds of thinking: between close, analytical thinking, where we use logic
and facts, and ‘reflective’ thinking, where we use our experience of the world to
inform our decision. Good decisions usually rely on judging how the two sorts of
thinking inform the final decision. For example, an Infantry Officer in Afghanistan
may assess the ground and identify three feasible routes to an objective. While his
close, analytical thinking tells him that Route Bravo offers the best protection, access
etc., his reflective thinking tells him that the insurgents have seen another platoon in
the Coy use Route Bravo on an earlier occasion.

So how does RMAS prepare Officer Cadets as problem solvers and decision
makers?

Officer Cadets, of course, will be solving all sorts of problems and making all sorts of decisions during their time at the Academy; as they organise themselves and their kit, they’ll make thousands of decisions about how to prioritise their time and effort. There will
be plenty of practice, too, during lessons – both in the field and in the Hall of Studies
– where they’ll be set problems and questions they’ll have to resolve. But, crucially,
they’ll also examine the processes of problem solving too. As well as carrying out
Command Tasks, they’ll be asked to reflect on how they arrived at a decision. And,
of course, they will be introduced to the ‘7 Questions’, the problem solving tool that
the Army uses to inform the Estimate Process. Their proficiency in using the 7
Questions will develop in the Intermediate Term and by the time their competence in basic tactics is assessed, they should have a reasonable grasp of it. Exercise Normandy Scholar, where they study situations that Allied troops faced in Normandy at the time of the D-Day landings, will ask them to use the 7 Questions to decide what
they might have done in that situation and to compare their solution with what really
happened.

Problem Solving in CABS

The lessons in CABS aim to foreground the processes and practices of problem
solving and, in doing so, complement their practical lessons in the field. The CABS
lessons are set out in a logical, ordered manner; not because that’s how we always
solve problems, but simply to give the lessons a structure and shape.

The lessons are organised around a problem solving model (below) that sets
out the various elements that make up the problem solving process.

Fig 2 Model of rational thinking: the problem passes through the Comprehension Space (where psycho-social factors operate), the Problem Space (the 7 Questions), the Option Generation Space (where creative thinking generates a range of options) and the Decision Making Space, arriving at the solution(s).

Although these often do not happen in this order, for the sake of simplicity we have
identified them as follows:

The Comprehension Space

This is where we make sense of the problem or problems before us. Crucial
here is how the problem is presented or ‘framed’. For example, we know that the
wording of problems can have an effect on how we might go about solving them. A plan
that will ‘save over 60%’ is often viewed as preferable to one that will ‘lose over a
third’ even though the result is the same. As well as the wording there are other
factors at play – the time available, the seriousness of the situation, the expectations
of those around the decision maker, and so on. The lesson is that we may need to stop
and think about how we comprehend the problem before we leap to a possible
solution. For example, does it matter that our intuitive solution is likely to be
predictable?
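The ‘save over 60%’ versus ‘lose over a third’ equivalence can be made concrete; the figures below are invented for illustration, not drawn from the paper:

```python
# Two framings of the same outcome: of 100 assets at risk,
# a plan preserves 62 and forfeits 38 (illustrative numbers).
total = 100
saved = 62
lost = total - saved

positive_frame = f"saves over 60% ({saved}/{total})"
negative_frame = f"loses over a third ({lost}/{total})"

# The underlying result is identical either way; only the
# description differs, which is the framing effect in miniature.
assert saved / total > 0.60
assert lost / total > 1 / 3
print(positive_frame)
print(negative_frame)
```

Both descriptions are literally true of the same plan, which is why the wording alone can shift a decision maker’s preference.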

The Problem Space

This is the phase where we interrogate the evidence and navigate our way
through the problem. This is best accomplished using a problem solving tool. The 7
Questions is an excellent aid in this regard: it is designed to help the problem solver
ensure that the issue has been explored rigorously. In the CABS lessons we introduce
a critical thinking tool (RAVEN-C) which is useful in filtering through and assessing
ambiguous or incomplete evidence.

Option Generation

The next step is to generate options (note that in using an intuitive method we would leap immediately to this step and so fail to rigorously interrogate the evidence. If the problem solver were not an expert, this would almost certainly mean that some options would never be considered). In the option generation phase the solver should
switch from logical, rational thinking to a more creative, innovative style – the idea
here is to generate a range of options that meet the objective. The ability to think
creatively will almost certainly be the key to success in situations where the old ways
simply do not work any longer.

The Decision Making Space

At this final stage the task is to weigh up and assess the options that have been generated. It is wise here to use a formal decision making tool that will a) allow us to identify the criteria against which the options are to be assessed and b) provide a means of auditing our decision (ie, we will be able at a later date to explain and justify our choice of option).

THE COMPREHENSION SPACE

Problem Types

Effective problem solving often hinges on recognising the type of problem that
is being faced. In general terms, there are four main problem types: simplistic,
deterministic, random and indeterminate (or ‘Wicked’):

1. Simplistic problems: where there is one and only one answer. For example,
who is the Chief of the Defence Staff?

2. Deterministic problems: where the answer is arrived at by the application of a formula, algorithm or protocol. For example, the circumference of a circle is found by applying a certain formula. In a military context, an SOP is such a formula – we carry out certain tasks according to the SOP.

3. Random problems: where there will be only one answer, but it could be any one of a number of possible answers. For example, who will win this intake’s Sword of Honour?

4. Indeterminate problems: where the answer itself is complex, hard to identify or changes over time. For example, what is ‘success’ in a particular military operation? Answering such a question means taking into account a huge range of factors including how others see the issue, how the issue has changed and how your earlier decisions and actions have themselves affected the issue. To use Rittel and Webber’s terminology, these are ‘wicked problems’.

Fig 3 Types of problems and the role of facts and judgment: as problems move from simplistic through deterministic and random to indeterminate, the role of facts and algorithms diminishes and the role of judgment grows.

When faced with a problem it is useful to categorise it and ask: ‘what kind of
problem is this?’ Once identified, we can apply the most appropriate problem solving
process. As the diagram shows, in both simplistic and deterministic problems, facts
and algorithms are applied; judgment is not. For example, there is no discussion
about how to carry out a VCP; you simply use the appropriate SOP. However,
random problems entail both facts and judgment. To return to the question of the
Sword of Honour winner, although it is possible that any of the senior term cadets
will win, pertinent evidence from performance, and reports etc., enables us to narrow
the field down considerably. In dealing with indeterminate problems, ‘facts’ are
likely to be rejected or contested by the various stakeholders making their use of little
value or, worse, counter-productive, particularly when those ‘facts’ are culturally
derived.

Of course, in the complex real world we inhabit, the ‘problem’ we are faced
with will, in all likelihood, be a combination of several problems and a helpful first
step can be to unravel the separate problems and identify the kind of problem they
are. This can be useful as we begin to solve them; for example, simplistic and deterministic problems can be easily delegated, leaving you with more time to dedicate to indeterminate problems.
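The triage described above – categorise the problem first, then hand off the simpler kinds – can be sketched as follows. The four categories come from the paper; the delegation rule is an illustrative assumption:

```python
from enum import Enum

class ProblemType(Enum):
    SIMPLISTIC = "one fixed answer"
    DETERMINISTIC = "answer follows from a formula, algorithm or SOP"
    RANDOM = "one answer among many possibles; judgment narrows the field"
    INDETERMINATE = "wicked: contested, shifting, no definitive answer"

def can_delegate(problem_type):
    # Simplistic and deterministic problems are settled by facts and
    # algorithms rather than judgment, so they are the easiest to
    # hand off (illustrative rule, not doctrine).
    return problem_type in (ProblemType.SIMPLISTIC,
                            ProblemType.DETERMINISTIC)
```

Routing problems this way frees the decision maker’s own judgment for the random and indeterminate cases where it is actually needed.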

Psycho-social factors: heuristics and biases

There is a human tendency to make systematic errors in judgment, knowledge, and reasoning. In psychology these are called ‘cognitive biases’. Such biases can
result from the fact that we use mental shortcuts called ‘heuristics’. They include
errors in statistical judgment, attribution, and memory. Cognitive biases are a
common outcome of human thought, and often drastically skew the reliability of
evidence.

The concept of cognitive biases was introduced by Amos Tversky and Daniel
Kahneman in 1972 and grew out of their experience of people's innumeracy, or
inability to reason intuitively with greater orders of magnitude. They and their
colleagues demonstrated several replicable ways in which human judgments and
decisions differ from a rational logical model. They explained these differences in
terms of heuristics, rules which are simple for the brain to compute but introduce
systematic errors; for instance the ‘availability heuristic’, when the ease with which
something comes to mind is used to indicate how often (or how recently) it has been
encountered.

Some biases affect decision-making, where the desirability of options has to be considered (e.g., the ‘sunk cost fallacy’). Others such as ‘illusory correlation’ affect
judgment of how likely something is, or of whether one thing is the cause of another.
A distinctive class of biases affects memory, such as the ‘consistency bias’
(remembering one's past attitudes and behaviour as more similar to one's present
attitudes).

Some biases reflect a subject's motivation; for example, the desire for a
positive self-image leading to ‘egocentric bias’ and the avoidance of unpleasant
cognitive dissonance. Other biases are due to the particular way the brain perceives, forms memories and makes judgments. These are some of the most common
heuristics and biases:

• ‘Framing’: using a too-narrow approach and description of the situation or issue.

• ‘Hindsight bias’, sometimes called the "I-knew-it-all-along" effect, is the inclination to see past events as being predictable.

• ‘Fundamental attribution error’ is the tendency for people to over-emphasise personality-based explanations for behaviours observed in others while under-emphasising the role and power of situational influences on the same behaviour.

• The fundamental attribution error can also operate in the reverse direction: we over-emphasise the role and power of situational influences on ourselves and under-estimate personality-based explanations.

• ‘Confirmation bias’ is the tendency to search for or interpret information in a way that confirms one’s preconceptions; this is related to the concept of cognitive dissonance.

• ‘Self-serving bias’ is the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests.

• ‘Belief bias’ is when one’s evaluation of the logical strength of an argument is biased by one’s belief in the truth or falsity of the conclusion.

These heuristics and biases have affected and will continue to affect military decision
makers3 and awareness of how they influence us is essential to anyone who would
seek to improve their decision making.

THE PROBLEM SPACE

Sifting through and weighing up evidence

It is very helpful when assessing evidence to have a model to guide our thinking. One model is based on the acronym RAVEN-C. There will be a variety of
situations where we are faced with incomplete and ambiguous information – be it a
series of G2 feeds on the ground on Ops, or a collection of soldiers giving different
versions of what happened on Saturday night in the local town. The RAVEN-C
model does not provide a hard and fast way of telling truth from falsehood, but it
does offer a means of assessing credibility that will help the decision maker arrive at
a judgment. At the very least, it foregrounds ‘thinking about thinking’ and provides a
cognitive ‘audit trail’.

THE RAVEN-C MODEL:

Reputation

• Think about the credibility of a source’s claim. Clearly it is strengthened or weakened by knowledge of past performance or character. However, exercise caution; although past behaviour is the best predictor of future behaviour it cannot be relied on entirely.

3
Williams, BS (2010) ‘Heuristics and Biases in Military Decision Making’ Military Review Sep/Oct 2010, Vol
90 Issue 5, p 40-52

Ability to observe

• Did the source really see the event? Sources often imply to others or
even believe themselves that they actually saw an event, but when
questioned in depth it transpires that they did not.

• Eye witnesses provide evidence based on first-hand experience. Be very suspicious, though, if all the eyewitness accounts are the same, since everyone sees different parts of an event. Too much similarity in the accounts suggests collusion, intentional or unintentional.

Vested Interests

• What are the interests of the source in this situation? These interests
can be difficult to determine and it is easy to ignore them or to jump to
a cynical, and perhaps false, conclusion about them. Nevertheless,
whilst keeping an open mind, this question does require an answer.

• Vested interest to lie – having a motive to lie out of self-interest; this might be for self-gain or to avoid losing something, such as a job or friend.

• Vested interest to tell the truth – having a motive to tell the truth;
for example, because lying might endanger the source’s professional
reputation or religious or moral scruples.

Expertise and experience

• This is about having skills, experience or training that help interpret the situation correctly. The evidence of a relevant expert has more credibility than that of non-experts.

• However, experts can get things wrong, so their evidence must be subject to close scrutiny by other experts and by lay people too. Rarely should expertise be accepted uncritically and relied on without other supporting evidence.

Neutrality

• How neutral is the evidence source? A fully neutral source has no involvement with the issue.

• How biased is the source? Could the source be protecting a friend or blaming someone the person doesn’t like for subconscious reasons? The source may be unaware of its own biases.

Corroboration

• Corroboration/conflict – where more than one source of evidence supports the same conclusion, or where two sources conflict with each other.
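For illustration only, the six RAVEN-C headings can be turned into a rough numeric checklist. The headings are the model’s; reducing them to ratings and an average is our own simplifying assumption, not part of the model, which remains a qualitative aid to judgment:

```python
# Rate a source 1 (weak) to 5 (strong) against each RAVEN-C heading.
# The headings come from the model; the example ratings are invented.
RAVEN_C = ["Reputation", "Ability to observe", "Vested interests",
           "Expertise and experience", "Neutrality", "Corroboration"]

def credibility(ratings):
    """Return an average rating plus the ratings themselves,
    which serve as the cognitive 'audit trail'."""
    missing = [h for h in RAVEN_C if h not in ratings]
    if missing:
        raise ValueError(f"unrated headings: {missing}")
    return sum(ratings.values()) / len(RAVEN_C), dict(ratings)

witness = {"Reputation": 4, "Ability to observe": 2,
           "Vested interests": 3, "Expertise and experience": 2,
           "Neutrality": 4, "Corroboration": 5}
score, trail = credibility(witness)
```

Even such a crude checklist forces every heading to be considered, and the retained ratings let the decision maker explain later why one account was weighted over another.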

Wicked Problems

The concept of ‘Wicked Problems’ (referred to above as ‘indeterminate problems’) and how to go about solving them lies at the heart of the Commander-in-Chief’s wish to see the Army ‘training as we need to fight in Afghanistan’4. In that theatre political, moral and cultural ambiguity often prevails, competing interests characterise issues, and outcomes are unpredictable – perceptions and worldviews collide with unexpected results. ‘Wicked problems’ was a label Rittel and Webber gave to problems where there ‘… is no consensus on what the problems are, let alone how to resolve them.’5 They claimed that such problems cannot be resolved with traditional analytical approaches. It is clear that where such problems predominate, the success of both kinetic and non-kinetic efforts relies on understanding their nature and finding alternative ways of solving them.

According to Rittel and Webber, wicked problems have ten characteristics:

1. There is no definitive formulation of the problem. Formulating the problem and the solution are essentially the same thing. Each attempt at creating a solution changes the understanding of the problem.

2. Wicked problems have no stopping rule. Since you cannot define the problem,
it is difficult to tell when it is resolved. The problem solving process ends when
resources are depleted, stakeholders lose interest or political realities change.

3. Solutions to wicked problems are not true-or-false but good-or-bad. Since there are no unambiguous criteria for deciding if the problem is resolved, getting all stakeholders to agree that a resolution is ‘good enough’ can be a challenge.

4. There is no immediate and no ultimate test of a solution to a wicked problem. Solutions to wicked problems generate waves of consequences, and it is impossible to know how all of the consequences will eventually play out.

5. Every implemented solution to a wicked problem has consequences. Once your action has been taken it cannot be taken back and will, in turn, affect the problem.
4
FDT Directive 2009/10
5
Rittel, H and Webber, M (1973) ‘Dilemmas in a General Theory of Planning’ Policy Sciences Vol 4 pp 155-
169

6. Wicked problems do not have a well-described set of potential solutions.
Various stakeholders will have differing views of acceptable solutions. It is a
matter of judgment as to when enough potential solutions have emerged and
which should be pursued.

7. Every wicked problem is essentially unique. There are no ‘classes’ of solutions that can be applied to a specific case. Rittel and Webber advise that “part of the art of dealing with wicked problems is the art of not knowing too early what type of solution to apply.”

8. Every wicked problem can be considered a symptom of another problem. A wicked problem is a set of interlocking issues and constraints which change over time, embedded in a dynamic social context.

9. The causes of a wicked problem can be explained in numerous ways. There are many stakeholders who will have various and changing ideas about what might be a problem, what might be causing it, and how to resolve it.

10. Those who seek to solve these problems have no right to be wrong. A scientist
is expected to formulate a hypothesis, which may or may not be supportable by
evidence. Military officers do not have such a luxury: they are expected to get
things right.

Recognizing wicked problems

How might you identify a wicked problem? A key indicator is divergence. If requirements are volatile, constraints keep changing, stakeholders can’t agree and the target is constantly moving, in all likelihood you are dealing with a wicked problem.
If considerable time and effort has been spent, but there isn’t much to show for it,
there is probably a wicked problem lurking somewhere.

Tackling wicked problems

To quote Rittel & Webber: “The classical systems approach … is based on the
assumption that a … project can be organized into distinct phases: ‘understand the
problems’, ‘gather information,’ ‘synthesize information’, ‘work out solutions’ and
the like. For wicked problems, however, this type of scheme does not work. One
cannot understand the problem without knowing about its context; one cannot
meaningfully search for information without the orientation of a solution concept;
one cannot first understand, then solve.”

The appropriate way to tackle wicked problems is to discuss them. Consensus emerges through the process of laying out alternative understandings of the problem, competing interests, priorities and constraints. The application of more formal analysis tools is impossible until the problem can be articulated in a concise, agreed-upon, well-bounded manner.

Wicked problems are usually resolved through discussion, consensus, iterations, and accepting change as a normal part of the process. It is very useful to consider how complex the problem is and the type of complexity it exhibits. Complexity might be thought of in three ways: dynamic, generative and social.

a. Dynamic complexity. Dynamic complexity is low when cause and effect are
nearby (physically or temporally); e.g., ‘the car won’t start’. Dynamic complexity
is high when cause and effect are far apart; e.g., the effects of how we treat the
environment today on future generations.

b. Generative complexity. Generative complexity is low when key aspects of the future are predictable; e.g., the firm will continue to make the same widgets in the same way for the same customers. Generative complexity is high when the future is unpredictable; e.g., the firm is changing hands, moving production, etc.

c. Social complexity. Social complexity is low when all the people have the
same assumptions, values, rationales etc.; e.g., a group of Officer Cadets facing a
command task. Social complexity is high when the people involved have different
assumptions, values, worldview, etc.; e.g., a military operation in an unfamiliar
and alien society.

Problems that are of low complexity can be solved simply by piece-meal, backward-looking, and authoritarian means. (Deal with the problem logically; one thing at a time; what worked before will work now; “listen to me; you know I know what I’m doing”.)

By contrast, high complexity problems require systematic, emergent and participatory approaches. (Try to understand the relationship between the context and the problem, and the problem and the context; as issues reveal themselves be prepared to alter course and act differently; give ownership of the problem to as many relevant perspectives as possible; talk openly, listen reflectively.)
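The triage implied by the three complexity dimensions can be sketched as a rule of thumb. The ‘any high dimension means a participatory approach’ rule is an illustrative assumption rather than a prescription from the text:

```python
# Rate each complexity dimension as low or high and recommend an
# approach. The dimensions come from the discussion above; the
# routing rule itself is an illustrative simplification.
def recommend_approach(dynamic_high, generative_high, social_high):
    if dynamic_high or generative_high or social_high:
        return "systematic, emergent, participatory"
    return "piece-meal, backward-looking, authoritarian (routine methods)"

# A stalled car is low on all three dimensions; a military operation
# in an unfamiliar society is high on at least the social dimension.
print(recommend_approach(False, False, False))
print(recommend_approach(False, False, True))
```

The value of even so crude a rule is that it forces the solver to ask the three complexity questions before reaching for a familiar method.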

These general approaches spawn techniques such as ‘scenario planning’. This technique is for use with a range of stakeholders who have a history of disagreement and whose perceptions of the issue are likely to be very different. The technique is to place the stakeholders together and to talk through a range of possible futures: one stakeholder proposes an action and the others respond by saying how they would react, and so on. When one possible ‘future’ has been explored, another stakeholder begins another scenario by proposing an action. The aim is to expose and ultimately accommodate different perspectives and agendas. A variation is ‘story boarding’. This method is done without stakeholders and entails a ROC (Rehearsal of Concept) drill-type run through the scenario. The key here is to challenge assumptions, be they cultural, moral, political etc.

OPTION GENERATION

Creative thinking

Creative thinking is commonly thought of as something that ‘creative’ people
do and we associate it with artists and the like. But the fact is that any problem solver
needs to be able to use and draw upon creative thinking skills. This is for two
reasons:

1. The situations you will encounter (both professionally and personally) will be
unique to your experience. ‘Off the shelf’, ‘template’ solutions will often not
produce the most appropriate or most effective answers. You will need to devise
solutions and courses of action to suit the circumstances: in other words, to
generate novel solutions to novel situations. And that requires creative thinking.

2. As an Army Officer there will be circumstances in which innovation and
creativity will be especially relevant, particularly where you are seeking to gain an
advantage over an adversary. The obvious course of action will be obvious to the
adversary too.

There are a number of techniques that can encourage us to think creatively
such as:

• Random input
• Problem reversal
• Lateral thinking
• Forced relationships/analogy
• Metaphorical thinking
• Unconscious problem solving
• Fuzzy thinking

However, if any of these techniques are to work, we have to understand and
overcome some of the barriers to creativity and innovation.

Risk Aversion

It is perfectly understandable and reasonable that aspects of military
decision-making exhibit an aversion to risk. With the stakes so potentially high,
decisions have to be thought through carefully and the risks weighed up
judiciously. In practice this
can mean defaulting to the ‘safe’ tried and tested option. In many cases this will,
indeed, be the correct course of action. However, there will be other situations where
innovation and creativity should be given a chance. If you examine the model of
rational thinking shown in fig 2, you will see that creativity is not being
recommended at every stage in the process. Rather it is recommended at one
particular stage: where the problem-solver has analysed the evidence and is now
generating a range of possible solutions. Once generated, all options are then
rigorously assessed against the criteria that any solution must meet. By adopting such
strategies, we can encourage and draw upon creative thinking skills while staying
completely focussed on what is practical and effective.

Group Pressure

It is easy to fall prey to forces of conformity and to refrain from making
suggestions or developing ideas that you fear the group might disapprove of or not
take seriously. The fact is that groups where conformity suppresses the generation of
ideas have a tendency to produce, at best, barely adequate solutions and at worst,
poor and predictable solutions. ‘Brainstorming’ is commonly thought, in business
and other arenas, to promote effective creativity. The evidence shows otherwise. The best
method is for each member of the group to spend a significant time alone working on
the problem and for the group to assemble to discuss the ideas that these individuals
have developed.

Personal Disposition

It is important to see creative thinking as something we all do, all the time, as
we react to the unique circumstances of our lives. It is not the preserve of ‘creative
people’. However, as with any skill, some are more adept than others, so we need to
work on it if we want to be the best we can be.
Creative thinking skills go hand in hand with other thinking skills such as ‘rational
thinking’ skills (the sort of close critical analysis we engage in when we interrogate
evidence) and ‘reflective thinking’ skills (when we take our experience to a problem
or situation and use that to predict what might happen if we make a particular
decision or choose a particular course of action). Developing those skills should be
part of ongoing personal and professional development.

THE DECISION MAKING SPACE

Decision Making Tools

It can be extremely useful to see decision making as a discrete and separate
phase in the problem solving process. So, rather than leap to the solution immediately
and then ‘backward engineer’ evidence to support your decision, you might generate
a number of options and then choose the best one. There are a number of different
tools you can use, such as decision making trees, and various kinds of assessment
tools, such as SWOT analysis, where we identify each option’s strengths and
weaknesses, the opportunities it offers and the threats it might be prey to. There are,
too, decision making matrices that can be employed. These can be very useful in that
they can:

a. set out the process and thus create an auditable trail which can allow the
decision maker to explain their decision at a later date;

b. codify and quantify the inescapable subjectivity that will play a part in any
deliberations;

c. bind a group of individuals into a single process to which they all contribute;
and, perhaps most importantly,

d. offer a system in which each element is paired with another and each pair is
assessed using a single criterion at a time – the human mind can only make a
series of single decisions. So, for example, if you were asked to choose between
taking Soldier A, Soldier B or Soldier C on a mission you would be better advised
to first identify the key criteria – say marksmanship and endurance – and then
consider each pair (A and B, A and C, B and C) separately against each single
criterion, rather than try to assess all three on both dimensions simultaneously.
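The pair-at-a-time principle in (d) can be sketched as follows. The candidates, criteria and scores are hypothetical, invented for illustration; only the method of comparing one pair against one criterion at a time comes from the discussion above.

```python
from itertools import combinations

# Hypothetical scores (1-10) for each candidate on the two example criteria;
# the names and numbers are invented purely for illustration.
scores = {
    "Soldier A": {"marksmanship": 8, "endurance": 6},
    "Soldier B": {"marksmanship": 9, "endurance": 9},
    "Soldier C": {"marksmanship": 7, "endurance": 5},
}

def pairwise_wins(scores):
    """Tally wins by comparing each pair on a single criterion at a time."""
    wins = {name: 0 for name in scores}
    criteria = list(next(iter(scores.values())))
    for a, b in combinations(scores, 2):      # pairs: (A, B), (A, C), (B, C)
        for criterion in criteria:            # one single-criterion judgment per step
            if scores[a][criterion] > scores[b][criterion]:
                wins[a] += 1
            elif scores[a][criterion] < scores[b][criterion]:
                wins[b] += 1
    return wins

wins = pairwise_wins(scores)
print(wins)                      # tally of single-criterion wins per candidate
print(max(wins, key=wins.get))   # Soldier B
```

Each comparison in the inner loop is a single decision of the kind the text describes; the tally simply aggregates them, and the trail of pairwise judgments remains auditable.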

Is the ‘best’ option the ‘right’ one?

There is a final issue to consider here and in many ways it is the most
important: the ethical dimension of decision making. This is complex territory that
raises as many questions as it answers. How might ethics and values be incorporated
into a model of problem solving and decision making? Is there a set of moral
principles that we take to every situation or do we derive the most ethical solution
from weighing up the rights and wrongs of specific issues at hand? Philosophers
often set ethical dilemmas to test these questions. Consider these famous examples:

a. You are travelling in a remote part of the world alone and unarmed. You
stumble into a small village to find a firing squad about to execute ten children
aged between 7 and 8 years. You express your horror and the commander of the
firing squad offers to spare 9 children as long as you choose the one that should be
executed. Do you take up the offer? And if you do, how do you choose?

b. A runaway train is travelling down a disused rail line. It will run into and kill
10 people who are picnicking on the line. You can prevent this by changing the
course of the train to a side line. However, there are two workers on the side line
who will be killed if the train is diverted. What do you do?

c. A man buys a chicken at a supermarket. He takes it home and has sex with it.
He then cooks and eats the chicken. Explain why this is morally wrong.

That final example is drawn from the work of Jonathan Haidt, Professor of Social
Psychology at the University of Virginia, who runs www.yourmorals.org, a website
dedicated to exploring the relationship between morals and emotions.

More Questions than Answers

The lessons in CABS and elsewhere in the Academy will, we hope, help to
develop the Officer Cadets’ problem solving and decision making, but there is much
that they will need to resolve personally and wider issues that the Army needs to
address collectively:

• How do we balance the use of effortful problem solving strategies with intuitive
strategies?

• How do we prepare men and women to cope with environmental factors – such
as stress or the need for quick decisions – and how will these affect their
problem solving?

• How do we incorporate ethics and values into problem solving and decision
making?

• How does personality affect an individual’s problem solving style?

• What effect do personal and organisational attitudes towards risk have on the
decisions that individuals make?

Further reading

Baron, J (1985) Rationality and Intelligence New York: Cambridge University Press
Baron, J (2008) Thinking and Deciding New York: Cambridge University Press
Dawes, R (2001) Everyday Irrationality Colorado: Westview Press
Gilovich, T, Griffin, D and Kahneman, D (eds) (2002) Heuristics and Biases: The Psychology of
Intuitive Judgment New York: Cambridge University Press
Haidt, J (2001). The emotional dog and its rational tail: A social intuitionist approach to moral
judgment. Psychological Review, 108, 814-834
Kahane, A (2007) Solving tough problems: an open way of talking, listening, and creating new
realities California: Berrett-Koehler
Kahneman, D and Tversky, A (1972), "Subjective probability: A judgment of representativeness",
Cognitive Psychology 3: 430–454
Klein, G (2003) The Power of Intuition New York: Doubleday
Morgan, J (1998) The Thinker’s Toolkit New York: Three Rivers Press
Plous, S (1993) The Psychology of Judgment and Decision Making New York: McGraw-Hill
Rittel, H and Webber, M (1973) ‘Dilemmas in a General Theory of Planning’ Policy Sciences Vol
4 pp 155-169
Simon, HA (1956) ‘Rational choice and the structure of the environment’ Psychological Review
Vol 63 No 2, 129-138
Williams, BS (2010) ‘Heuristics and Biases in Military Decision Making’ Military Review Sep/Oct
2010, Vol 90 Issue 5, p 40-52
