

Problem-Solving and Reasoning

November 6, 2008
The Problem with problem-solving research

“In field research, there is often too much [complexity] to allow for definitive conclusions, and in laboratory research, there is usually too little complexity to allow for any interesting conclusions”
Brehmer & Dörner (1993), Computers in Human Behavior, 9, 171-183
Salient differences between puzzle problems and real-world problems
• Puzzle problems
– unfamiliar
– involve no prior knowledge
– all necessary information is present in the problem statement
– requirements are unambiguous
• Real-world problems
– familiar
– require prior knowledge
– necessary information often absent
– solver must ask ‘what is the goal?’
Problem Examples
• Water jug problem
• Two-string problem
• Nine-dot problem
• Candle Box problem
• Missionaries and cannibals
• Tower of Hanoi
You have three containers, one holding 8 quarts, one holding 5
quarts, and one holding 3 quarts. Starting with the 8-quart
container full of water, and using no other measuring devices, give
me back two containers each containing 4 quarts of water.
2-string problem
Candle Box Problem (Duncker, 1945)
Gestalt Viewpoint
• Problem-solving is both reproductive and
productive
• Reproductive PS involves re-use of previous
experience (can be beneficial or detrimental)
• Productive problem-solving is characterized
by restructuring and insight
• Insight accompanied by subjective “ah-ha”
experience
Clinical Psychology Graduate Student constructing
office furniture for student workspaces
Gestalt Contributions
• Perception more than just association –
it involves conceptualization
• Functional Fixedness can hinder
problem-solving (candle box problem)
• Problem restructuring: productive
• Development of insight
• Implication: importance of problem
representation
Information-Processing Approach
to Problem-Solving
• Problem-Space Theory
– solving a problem involves negotiating alternative
paths to a solution
– initial state is linked to goal state by a path
– knowledge states are produced by the application
of mental operators
– algorithms vs. heuristics are used to move along
the path
– limited processing resources provide constraints
on the degree to which multiple moves can be
considered
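To make the problem-space idea concrete, here is a minimal sketch (added for illustration, not from the slides) that treats the 8-5-3 water-jug problem above as a problem space: each state is the current contents of the three jugs, each pour is a mental operator, and breadth-first search negotiates paths from the initial state to a goal state. The goal representation (4, 4, 0) is an assumption that the two 4-quart amounts end up in the 8- and 5-quart jugs.

    from collections import deque

    CAPACITY = (8, 5, 3)
    START = (8, 0, 0)

    def successors(state):
        """Apply every legal pour operator to a knowledge state."""
        for src in range(3):
            for dst in range(3):
                if src == dst or state[src] == 0:
                    continue
                amount = min(state[src], CAPACITY[dst] - state[dst])
                if amount == 0:
                    continue
                nxt = list(state)
                nxt[src] -= amount
                nxt[dst] += amount
                yield tuple(nxt)

    def solve(goal=(4, 4, 0)):
        """Breadth-first search over the problem space; returns a path of states."""
        frontier = deque([[START]])
        visited = {START}
        while frontier:
            path = frontier.popleft()
            if path[-1] == goal:
                return path
            for nxt in successors(path[-1]):
                if nxt not in visited:
                    visited.add(nxt)
                    frontier.append(path + [nxt])
        return None

    print(solve())  # shortest sequence of pours from (8,0,0) to (4,4,0)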
Ohlsson’s Insight Theory
• Gestalt findings can be reinterpreted within PST
– multiple mental representations of the same problem
– specific knowledge operators needed are retrieved
from memory
– current representation of the problem acts as a
memory probe
– impasses in problem-solving are resolved through ‘re-representation’
• elaboration
• constraint relaxation
• restructuring or recategorization
Routine v. Insight Problems: A
useful distinction?
• Key Concept: insight and trial-and-error (routine)
problems involve subjectively different experiences
• Key Debate: “Special Process” vs. “Business as
Usual”
• Routine: problem-solvers good at predicting their
success; monitor accurately how close they are to
solution
• Insight: problem-solvers poor at predicting success;
can’t monitor closeness to solution
– “What can move large logs but cannot move a small nail?”
Neurobiology of Insight
Bowden et al. (2005), Trends in Cognitive Sciences
Problem Isomorphs
• Similar formal structure of two problems
• Reasoning by analogy
• Similarities often very difficult to detect
if the problems do not have identical
structure (an impediment to
generalization)
• Military vs. radiation problem
Duncker’s (1945) Radiation Problem

Suppose you are a doctor faced with a patient who has a malignant
tumor in his or her stomach. It is impossible to operate on the
patient, but unless the tumor is destroyed the patient will die. There
is a special type of ray that can be used to destroy the tumor, as long
as the rays reach the tumor with sufficient intensity. However, at the
necessary intensity, the healthy tissue that the rays pass through will
also be destroyed and the patient will die. At lower intensities, the
rays are harmless but they will not affect the tumor either. What
procedure might the doctor employ to destroy the tumor with the
rays, at the same time avoiding destroying any healthy tissue?
Duncker Radiation Problem
Important Ideas in Problem-Space
Theory
• Problem-space refers to the abstract structure of a problem
• Operators are specific knowledge structures that transform
data
• Algorithm: method or procedure
• Heuristics: strategies, “rules of thumb”
– means-end analysis: calculate difference between
current state and goal; create a subgoal to reduce that
difference; select an operator that will solve this subgoal
– Anti-looping heuristic: don’t go further from the goal
than you currently are
• Subgoal structure – essentially short- and long-term goals (interim vs. final destinations)
http://www.learn4good.com/games/puzzle/boat.htm
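As an illustration of means-ends analysis as difference reduction (a toy sketch, not from the slides; the numeric “state” and the operator set are invented): at each step the solver measures the difference between the current state and the goal, applies the operator that most reduces it, and uses an anti-looping check to avoid moving farther from the goal.

    OPERATORS = {"add 3": lambda x: x + 3,
                 "subtract 2": lambda x: x - 2,
                 "double": lambda x: x * 2}

    def difference(state, goal):
        return abs(goal - state)

    def means_ends(state, goal, max_steps=20):
        for _ in range(max_steps):
            if state == goal:
                return state
            # Select the operator whose result lies closest to the goal
            name, op = min(OPERATORS.items(),
                           key=lambda item: difference(item[1](state), goal))
            new_state = op(state)
            # Anti-looping heuristic: never move farther from the goal
            if difference(new_state, goal) >= difference(state, goal):
                break
            print(f"{state} --{name}--> {new_state}")
            state = new_state
        return state

    means_ends(2, 13)  # 2 -> 5 -> 10 -> 13

Note that this greedy difference reduction fails on problems (such as missionaries and cannibals) where a correct move must temporarily increase the distance from the goal, which is exactly where the anti-looping heuristic misleads solvers.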
Progress Monitoring Theory (MacGregor, Ormerod, & Chronicle, 2001)
• Two general problem-solving heuristics
– Maximisation heuristic: making the most headway possible on each move
– Progress monitoring: assessing the rate of progress toward the goal
• Criterion failure (a “wake-up call”) causes problem solvers to seek an alternative strategy and can be important in obtaining a solution – problems are solved more often when this wake-up call is received
Four versions of the eight-coin problem. The dark shading indicates
that one coin was on top of another coin. The figures on the right have
valid two-dimensional moves, whereas those on the left do not. From
Ormerod et al., (2002). Copyright © 2002 by the American
Psychological Association. Reprinted with permission.
Evaluation
• Insight appears to be dependent on
– Constraint relaxation
– Combined with criterion failure
– Problem solvers who realize that means–ends
analysis is proving unsuccessful are more
responsive to changing their strategy than are those
for whom means–ends analysis is at least partially
successful
• Jones (2003)
– Previous experience with related problems is also
important
Mental Models in Problem-
Solving
• Developing an understanding of the
formal structure of the problem
• Imagistic and propositional processing
• Fleshing out formal implications of
problem by seeking examples and
counterexamples, or by playing out
real-world implications
Flagpole Problem
• Two flagpoles are standing, each 100
feet tall. A 150-foot rope is strung from
the top of one of the flagpoles to the
top of the other and hangs freely
between them. The lowest point of the
rope is 25 feet above the ground. How
far apart are the flagpoles?
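A worked check of the arithmetic (added note, not from the slides): each half of the 150-foot rope is 75 feet, and the vertical drop from a pole top (100 feet) to the lowest point (25 feet) is also 75 feet, so the rope can only hang straight down and back up. The flagpoles are therefore 0 feet apart – an answer that an imagistic model of a gently sagging rope tends to hide.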
The Singles Bar
• While sitting in a club where all single men
tell the truth and all married men lie, a
woman is approached by three men. She asks
the first guy if he is married, but the music is
so loud that she can't hear his answer. So she
turns to the second guy, who tells her, "The
first guy said, 'I am married,' but he really is
single." Then she turns to the third guy, who
says, "The second guy is single." Determine
the marital status of each of the three men.
Obstacles to Problem-Solving
• Mental sets, entrenchment, and fixation (viewing the problem from the “dominant paradigm”)
• Negative transfer
• Memory load/interference – importance
of incubation
Decision-Making and
Reasoning
[Linda problem vignette shown here: Tversky & Kahneman, 1983, p. 297]
Results
• “h” rated as more likely than “f” 85% of the
time!
• But Joe, it can’t be so! The probability of “h”
cannot be higher than “f”, since “h” is a
subset of “f”
• Illustrates that humans are often not the best
at thinking tasks, and that they are extremely
susceptible to aspects of the informational
environment that can “tip” or “bias” them to
think or act in particular ways
Reasoning Research
• Goal of judgment and decision-making is to
select from choices
• Goal of reasoning is to draw conclusions
deductively from principles (e.g., applying laws
of physics to determine power of an engine)
and inductively from evidence (e.g., using
safety statistics to draw inferences about the
safety of a particular car)
Decision-Making
Classical Decision Theory
• Assumes “rational man” - based on
economics
– fully informed regarding options and
outcomes
– sensitive to subtle distinctions between
options
– fully rational with regard to choice of
options
Expected Utility Theory
• Seek to maximize positive utility
(pleasure)
• Seek to minimize negative utility (pain)
• Components:
– subjective utility: based on individual’s
judged weightings of utility
– subjective probability: based on individual’s
judged weightings of probability
Which job should I take?
Company A: 50% chance of a 20% salary increase the first year
Company B: 90% chance of getting a 10% salary increase the first year

Classical Decision Theory
• Calculate the expected value for each option:
– Company A: .5 x .2 = .10
– Company B: .9 x .1 = .09
• Perform similar calculations for other factors (e.g., health insurance, severance package, vacation allowance, job satisfaction)
• Assuming other things equal, choose the job with Company A

Expected Utility Theory
• Assign individual subjective weightings to various factors (salary, health insurance, etc.)
• Assign individual subjective weights to the various probabilities of obtaining positive utility (strategy important)
• Calculate: Σ [p(pos)] - Σ [p(neg)]
• Choose the company based on the sum of expected positives minus negatives
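The Classical Decision Theory calculation above can be reproduced directly (a minimal sketch, not part of the original slides):

    # Salary-raise gamble from the slide: probability of the raise times
    # the size of the raise gives the expected first-year raise.
    offers = {"Company A": (0.5, 0.20), "Company B": (0.9, 0.10)}

    for company, (p_raise, raise_size) in offers.items():
        expected = p_raise * raise_size
        print(f"{company}: expected first-year raise = {expected:.2%}")
    # Company A: 10.00%, Company B: 9.00% -> A has the higher expected value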
Clinical Applications of Utility Theory
– Time Trade-Off Techniques
• "Imagine that you are told that you have 10 years left to live. In
connection with this you are also told that you can choose to live these
10 years in your current health state or that you can choose to give up
some life years to live for a shorter period in full health. Indicate with a
cross on the line the number of years in full health that you think is of
equal value to 10 years in your current health state“ If the person puts
the line on 4, the TTO is .4
• Patient is presented with iterative choices until s/he is indifferent to the choice; e.g., 20 years with blindness vs. 5 years in perfect health, vs. 10 years in perfect health, etc. If the indifference point is 17 years in perfect health vs. 20 years with one-eye blindness, the health utility of one-eye blindness is 17/20 = .85
Clinical Applications of Utility Theory
(cont’d)
• Standard Gamble Technique
– Patient ranks health care states along a continuum, and then is asked to choose between remaining in a given state and a gamble whose outcomes are full health or death; the relative size of the “death” region (i.e., the risk) is iteratively changed until the person is indifferent to the choice
Prospect Theory
(Kahneman & Tversky)
• Describes how individuals evaluate losses and gains – generally, we are “loss averse”
• Two stages
– Stage I: Editing: outcomes ordered following some heuristic;
set a reference point
– Stage II: Evaluation: compute utility value and choose
• Explains a variety of economic behaviors
– Status quo effect – insurance example (23% NJ, 53% PA)
– Endowment effect – coffee cup examples
– Sunk cost effect – vacation example
Prospect Theory Value Function (not symmetrical; indicates ‘loss aversion’)

The value function depicts risk aversion with gains (particularly at moderate probabilities, and with low-probability losses), and risk seeking with losses (particularly at moderate probabilities, and with low-probability gains).
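For reference, a sketch of the value function’s usual functional form. The parameters (alpha = beta ≈ .88, lambda ≈ 2.25) are the commonly cited estimates from Tversky & Kahneman (1992), assumed here rather than taken from these slides:

    # Illustrative prospect-theory value function (assumed parameters).
    ALPHA = 0.88   # curvature for gains (diminishing sensitivity)
    BETA = 0.88    # curvature for losses
    LAMBDA = 2.25  # loss-aversion coefficient: losses loom ~2x larger

    def value(x):
        """Subjective value of a gain (x > 0) or loss (x < 0) relative to the reference point."""
        if x >= 0:
            return x ** ALPHA
        return -LAMBDA * ((-x) ** BETA)

    print(value(100))   # ~57.5
    print(value(-100))  # ~-129.5: a $100 loss hurts more than a $100 gain pleases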
Prospect Theory – Example Applications
• Understanding white collar crime: covering up minor crimes (failure to cut losses)
• Iraq war and other examples of organizational inertia – sunk cost effect?
• Stock investing – why do so many investors hold onto a stock that has plummeted far more often than they keep a stock that has risen sharply or maintained a steady price?
• Health decision-making: risk-taking in bad situations (e.g., HIV/AIDS)
Framing (Prospect Theory)
• GAIN FRAMING: 600 people are at risk of dying of a
particular disease. Vaccine A could save 200 of these
lives. For Vaccine B, there is a .33 likelihood that all
600 people would be saved, but a .66 likelihood that all
600 people will die. Would you choose A or B? (most
choose A)
• LOSS FRAMING: 600 people are at risk of dying of a
particular disease. If Vaccine C is used, 400 of these
people will die. If Vaccine D is used, there is a .33
likelihood that no one will die, but a .66 likelihood that
all 600 people will die. Would you choose C or D?
(most choose D)
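A quick expected-value check (added note, not from the slides): Vaccine A saves 200 lives for certain, and Vaccine B saves .33 x 600 ≈ 200 lives on average; Vaccine C loses 400 (saves 200), and Vaccine D loses .66 x 600 ≈ 400 (saves ≈ 200) on average. All four options are numerically equivalent, so the preference reversal reflects the gain versus loss framing alone.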
Anchoring and Framing Effects
• Anchoring effect (actual answer = 40,320)
– Estimate: 8x7x6x5x4x3x2x1 (estimate is 2,250)
– Estimate: 1x2x3x4x5x6x7x8 (estimate is 512)
• Framing effects
– the way that options are presented affects option selection
• risk aversion when presented with gain options (pick a small but certain gain over a large but uncertain one)
• risk seeking when presented with potential losses
(choose large, uncertain loss rather than smaller, certain
loss)
Satisficing (Simon)
• Reaching “acceptable” goals
• Notion of “bounded rationality”: rationality,
but within limits
• Do not consider total range of options, but
consider options one by one until one meets
our minimum standards of acceptability
• Probably don’t reach optimal solution, but
also don’t spend eternity searching for one
(e.g., selecting a graduate school, selecting
crackers)
Elimination by Aspects (Tversky)
• Consider one aspect (attribute) of the available options
• Form a minimum criterion for that aspect
• Eliminate all options that don’t meet the minimum criterion
• Then select a second aspect…and so on (see the sketch below)
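A minimal sketch of the procedure (the options, attributes, and cutoffs are invented for illustration):

    # Elimination by aspects: screen options one attribute at a time,
    # dropping any option that misses that attribute's minimum criterion.
    options = {
        "Car A": {"price": 22000, "mpg": 35, "safety": 4},
        "Car B": {"price": 19000, "mpg": 28, "safety": 5},
        "Car C": {"price": 30000, "mpg": 40, "safety": 5},
    }
    # Aspects considered in order of importance: (attribute, test)
    aspects = [
        ("price",  lambda v: v <= 25000),   # eliminate anything over $25k
        ("safety", lambda v: v >= 5),       # then require a 5-star rating
        ("mpg",    lambda v: v >= 25),      # then require at least 25 mpg
    ]

    remaining = dict(options)
    for attribute, meets_criterion in aspects:
        remaining = {name: attrs for name, attrs in remaining.items()
                     if meets_criterion(attrs[attribute])}
        print(f"After screening on {attribute}: {sorted(remaining)}")
    # Only Car B survives: cheap enough, then safest among the survivors.

Note the contrast with expected utility: no overall weighted sum is ever computed, and an option eliminated early is never reconsidered even if it excels on later aspects.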
Models of Probability Judgments
• Descriptive: how people reach decisions
(naturalistic observation)
• Normative: how a decision should be made
using unlimited resources.
– Bayes’ Theorem, which combines a conditional probability P(E|H) with a prior probability P(H):

P(H|E) = [P(E|H) x P(H)] / [P(E|H) x P(H) + P(E|not H) x P(not H)]
       = (.9 x .05) / [(.9 x .05) + (.1 x .95)]
       ≈ .32
Bayes Theorem Applied
• In previous example, two types of probabilities exist:
– “prior probability”: probability that event will
occur given similar prior circumstances (e.g., p = .05 that your friend will invite your ex-husband to
the party)
– “conditional probability”: probability that new
information is true if a particular hypothesis is true
(e.g., p = .90 that the car you see parked belongs
to him)
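The party example from the formula slide, worked through in a short sketch (the .05 prior and the .90/.10 likelihoods are the figures used on that slide):

    def posterior(prior, p_e_given_h, p_e_given_not_h):
        """Bayes' theorem: P(H|E) from the prior and the two likelihoods."""
        numerator = p_e_given_h * prior
        return numerator / (numerator + p_e_given_not_h * (1 - prior))

    # Prior P(H) = .05, P(E|H) = .90, P(E|not H) = .10
    print(round(posterior(0.05, 0.90, 0.10), 2))  # 0.32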
Are we accurate probability
calculators?
• Probably not…we’re more conservative
• Edwards (1968): drawing chips, with
replacement, from one of two bags with
70/30 mix of red/white chips. If first chip is
red, what’s the probability that the second
chip will also be red? Actual p=.70 (subjects
say p=.60)
• Meehl’s criticisms of clinical decision-making
and the clinical-actuarial debate
Probability Judgments
• Three candidates, A, B, and C are running for
Mayor of Gainesville. In 6 separate polls, A led B
five times. In 18 polls, C led B 9 times. In a
comparison of A and C, who is more likely to win?
• It is known that 5% of the population is affected
by rubadubitis. A new diagnostic test gives true
positives of the disease 85% of the time, but has
a 10% false positive rate. Bub has tested positive. What is the probability that he has rubadubitis?
Common Heuristics in
Probability Judgments
• Frequency Heuristic: making use of
number of occurrence, rather than
probability of occurrence
– candidate example: C has more wins, but
A has greater proportion of wins (5/6);
most people choose C
[Linda problem vignette shown again: Tversky & Kahneman, 1983, p. 297]
Common Heuristics (cont’d)
• Representativeness Heuristic: making
choices based on how similar/representative
a person or sample is, rather than relying on
calculated probability
– fail to use the conjunctive rule: Linda is regarded as “representative” of a feminist, so most people rate the conjunctive option (“h”) as more likely
– fail to use baserates: rubadubitis example,
estimates are around .85 (actual answer is .31)
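Worked out with Bayes’ theorem from the earlier slide (an added check): P(disease | positive test) = (.85 x .05) / [(.85 x .05) + (.10 x .95)] = .0425 / .1375 ≈ .31. The representativeness-driven estimate of .85 ignores the 5% baserate.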
Common Heuristics (cont’d)
• Availability Heuristic: using most salient,
or apparent answer to guide judgment
– Which is more likely: death by tornado or death
by asthma? (asthma)
– Is the letter “k” more likely to occur in the first or
third position in English words? (3rd)
• Conclusion: people aren’t very good at
calculating probabilities; they rely on
heuristics
Heuristics and Biases
(Kahneman & Tversky)
• People commonly use short-cuts (heuristics)
• Heuristics lighten cognitive load, but lead to
greater biases and errors
• Example heuristics:
– REPRESENTATIVENESS: how representative
instance is of universe
– AVAILABILITY: how easily instances are called to
mind
Examples
• All families having exactly 6 children in
Pleasantville were surveyed. In 72 families,
the exact birth order was GBGBBG. What is
your estimate of the number of families in
which the birth order was BGBBBB?
• What percentage of men in a health survey
have had one or more heart attacks? What
percentage of surveyed men both are over 55
and have had one or more heart attacks?
(conjunction fallacy)
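A note on the first example (added, not from the slides): every specific birth order of six children is equally likely, with probability (1/2)^6, so the normative estimate for BGBBBB is also about 72 families; lower estimates reflect the representativeness heuristic, since BGBBBB looks less “random” than GBGBBG.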
Part II: Reasoning
Truth Tables and Logical
Operators
• Concept of propositional calculus
(assertion that is either true or false)
• Limited number of operators: not, and,
or, if…then, if and only if
• Truth tables chart truth value of
proposition by laying out state-of-world
possibilities
• Use of conditional logic
Truth Tables allow the logical, abstract structure of a reasoning
problem to be specified, further permitting analysis of whether
humans reason this way (they often don’t!)

P=it is raining “true” in the sense that there are


no grounds for falsifying it
Q=Alicia gets wet
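The table itself is not reproduced in this text version; the standard truth table for the conditional “if P then Q” is:

    P     Q     if P then Q
    T     T     T
    T     F     F
    F     T     T
    F     F     T

Only the row with P true and Q false falsifies the conditional; the two rows with P false count as “true” in the sense noted above.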
Forms of Conditional Reasoning,
based on “If P then Q”
• Valid Forms
– Modus Ponens: P, therefore Q
– Modus Tollens: not Q, therefore not P
• Invalid Forms
– Affirming the Consequent: Q, therefore P
– Denying the Antecedent: not P, therefore not Q
• Additional or alternative antecedents
affect the use of inferential forms
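These four forms can be checked mechanically by enumerating the truth table, as in this short sketch (added for illustration): an argument form is valid only if no row makes the premises true and the conclusion false.

    from itertools import product

    def implies(p, q):
        return (not p) or q

    # Each argument form: (name, premises as a function, conclusion as a function)
    forms = [
        ("Modus ponens (P -> Q, P, therefore Q)",
         lambda p, q: implies(p, q) and p,     lambda p, q: q),
        ("Modus tollens (P -> Q, not Q, therefore not P)",
         lambda p, q: implies(p, q) and not q, lambda p, q: not p),
        ("Affirming the consequent (P -> Q, Q, therefore P)",
         lambda p, q: implies(p, q) and q,     lambda p, q: p),
        ("Denying the antecedent (P -> Q, not P, therefore not Q)",
         lambda p, q: implies(p, q) and not p, lambda p, q: not q),
    ]

    for name, premises, conclusion in forms:
        # Valid iff no truth assignment makes the premises true and the conclusion false
        valid = all(conclusion(p, q)
                    for p, q in product([True, False], repeat=2)
                    if premises(p, q))
        print(f"{name}: {'valid' if valid else 'invalid'}")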
Theories of Reasoning
• Abstract-Rule Theories: reasoning proceeds
much like logical proofs
• Domain-Specific-Rule Theories: reasoning
based on schematic rules specific to the type
of problem (Wason’s selection task)
• Model Theories: reasoning proceeds using
mental models of the world (syllogisms)
• Bias Accounts: reasoning as a product of
nonlogical tendencies (believability bias)
Abstract-Rule Theory
• Natural language premises (If A, then B) encoded by
a comprehension mechanism; this mechanism is
normally rational but can be derailed
• Representation of premises is related to elementary,
abstract reasoning rules (e.g., modus ponens)
• If these rules do not produce a conclusion, then non-logical processes are invoked
• Types of errors
• comprehension: premise misconstrued
• heuristic inadequacy: poor strategy
• processing: attentional, working memory lapses
Abstract-Rule Account of Invalid
Inferences
• Premises are re- or mis-interpreted
• Importance of “co-operative principle”
(speaker tells hearer exactly what they
think the hearer should know); hearer
then makes invalid inferences
– e.g.: the only way Alicia can get wet is if it
rains on her
Status of Abstract-Rule Theory
• Can account for rule-based inference
problems and for effects of alternative
and multiple antecedents
• Comprehension component
underspecified
• Applicable only to propositional
reasoning situations
Domain-Specific Knowledge and
Reasoning
• Posit types of situation-specific rules that are
used to solve reasoning problems
(probabilistically based):
– specific prior experience
– schemata for different types of situations (e.g.,
permissions, obligations)
• Rules have specific form that can be applied
in all situations corresponding to that schema
Model Theory
• Three processes:
– comprehension of premises: semantics and
analogy
– combining/description: models of simple
premises are combined to form integrated model
– validation: search for counterexamples or
alternative models disconfirming the conclusion
• Models consume processing resources
• Errors arise from inadequate models
Rule v. Model Theory – an example
• Problem 1
– A is to the right of B
– C is to the left of B
– D is in front of C
– E is in front of A
• What is the relation between D and E?
• Model: C B A
         D   E
• Conclusion: “D is to the left of E” (70% accurate)
• Problem 2
– B is to the right of A
– C is to the left of B
– D is in front of C
– E is in front of B
• What is the relation between D and E?
• Model 1: C A B
           D   E
• Model 2: A C B
             D E
• Conclusion (from both models): “D is to the left of E” (46% accurate)

Rule-based theory says Problem 1 is harder (more premises needed); MMT says Problem 2 is harder (more models needed).
All of the artists are beekeepers.
Some of the beekeepers are clever.
Model Theory (cont’d)
• Valid Inferences
– develop and “flesh out” models based on
propositions
– working models out may take up processing
resources
• Invalid Inferences
– incorrect initial models (e.g., confusing
biconditional with conditional)
– can account for context effects; additional antecedents serve as counterexamples
Bias Theory

Are the conclusions in (1-4) true or false? Green is “believable”; red is “unbelievable”.
Basic idea: we accept conclusions based on their believability (the green ones are believable), rather than on whether they truly follow from the premises
