
Do you remember?

• Piaget’s stages:
• Sensorimotor stage (newborn to about 2 years old)
• Start developing the concept of object permanence
• Pre-operational (2-7)
• Start developing the concept of conservation
• Concrete operational (7-11 years)
• Have the concept of conservation but lack deductive reasoning skills
• Formal Operational (11+)
Module 2: Behaviorism
Objectives

• Discuss the assumptions and goals of behaviorism.


• Define classical conditioning and describe the procedures for producing
and measuring it.
• State the procedures for extinction in classical conditioning.
• Explain how classical conditioning pertains to drug tolerance.
• Describe how Pavlov tried to explain classical conditioning, and cite later
evidence that calls for a different explanation.
Activity 1
Write down on paper everything you are feeling and experiencing.
Include as many details as possible.
The Beginning
• Wilhelm Wundt (1832–1920) started the first formal
psychology research laboratory at the University of
Leipzig, Germany
• This established psychology as a science. Much of the methodology that
accompanied the introduction of scientific inquiry into
behavioral areas was borrowed or adapted from other
sciences. Physics, chemistry, biology, and physiology
were all important contributors to the start of
psychology.
Structuralism
• Structuralism. The position developed by Wundt and later expanded by Edward Titchener
(1867–1927) was called Structuralism. Psychology for the Structuralists was the study of
the introspective reports of normal human adults. Trained subjects made descriptive
reports of what they believed were the elements of stimuli presented to them.

Example: In a Structuralist experiment, you might be asked to report how you sensed the
weight, color, and texture of this book. You also might be asked to describe your feelings, if
any, toward the book. Merely saying, ‘‘This is a psychology book,’’ would not be sufficient
as an introspective report.
Functionalism
William James (1842–1910)
• Functionalists were concerned with the purposes of behavior rather than the structure of
the mind.
• Functionalists generally adopted a broader view of psychology than did Structuralists.
This allowed them to study all age groups and a variety of subjects.
• Many new areas of investigation resulted, including the study of motivation and emotion,
child psychology, animal experimentation, and various areas of applied psychology.

Example: Why do we see red? What is the purpose of color vision? Do other species see
color as humans do? Why or why not?
Do you remember?
1) ________ is how the study of psychology started. Psychologists
would put adult participants in a controlled area and have them
describe their emotions. They called these _____ reports.
2) This approach to psychology focused on the purpose of
psychological functions and was inspired by Darwin.
Activity 2:
• Pick a partner.
• One person opens one of the videos below and watches it. The other
person describes the watcher's reactions without using any words
that name emotions.

Video 1 Video 2
Behaviorism
John Watson (1878–1958)
• Established a system for the study of behavior in which it was believed that only the
observable responses made by the subject were relevant. Behaviorists denied the
concept of mind because a mind could not be observed. Their goal was to identify
orderly, lawful stimulus-response relationships.

Example: Behaviorists were interested only in observable phenomena. A strict Behaviorist
would not describe a person as "happy," because happiness is a state of mind and mind is
not observable. Instead, a Behaviorist might describe the person's smile or laugh, noting
the observable response to a stimulus.
Behaviorism
1. Observable behavior, rather than internal mental events or verbal
reconstructions of events, should be the focus of study.
2. Behavior should be studied in terms of its simplest elements (specific
stimuli and specific responses). Examples of behavioral reactions
investigated by the early researchers include reflexes, observable
emotional reactions, and motor and verbal responses.
3. The process of learning is behavioral change. A particular response
becomes associated with the occurrence of a particular stimulus, and
occurs in the presence of that stimulus.
Do you remember?
• What are the 3 approaches to psychology we have covered so far?
Unconditioned Reflex
• Unconditioned Response (UCR): a response that is elicited by an unconditioned
stimulus without prior learning.
• Unconditioned Stimulus (UCS): a stimulus that elicits a specific response without
prior learning.
• A reflex is involuntary and unconditioned, meaning it is not learned.
What happens if a dog hears a
metronome?
Design an experiment to train a dog
to salivate when hearing a
metronome
Conditioned Stimulus (CS) and
Response (CR)
• Conditioned Stimulus (CS): a neutral stimulus that, after repeated pairing with an
unconditioned stimulus, becomes associated with it.

• Conditioned Response (CR): the learned response that comes to be elicited by a conditioned
stimulus as a result of its repeated pairing with an unconditioned stimulus.

• Higher-Order Conditioning: once a link has been established between a conditioned
stimulus and a conditioned response, a new stimulus can be introduced and paired with the
conditioned stimulus so that it, too, comes to elicit the response.
Classical Conditioning (UCS, UCR, CS, CR)
1) A natural relationship must exist between a stimulus (object or event) and a reaction.
2) The stimulus that elicits the reaction is paired with a neutral stimulus, typically for
several trials.
[Diagram: the classical conditioning sequence, labeling the UCS, UCR, CS, and CR]
You try: label the following as unconditioned
stimulus, unconditioned response, conditioned
stimulus, conditioned response

Unconditioned Stimulus = drilling    Unconditioned Response = tension
Conditioned Stimulus = sound of a drill    Conditioned Response = tension
Pavlov's Further Experiment: Time
Remember:
In the buzzer and puff-of-air-in-the-eye conditioning experiment, label
the UCS, CS, UCR, and CR.
What do you predict will be the difference
between Group 1's and Group 2's responses to the tone?
Explain the chart below
Extinction
• Extinction: unlearning in which the conditioned stimulus no longer
elicits the conditioned response
• Why does this happen?
• Does extinction result in a permanent change?
Remembering Extinction
Do you recall?
• What is the difference between structuralism and functionalism?
• What is the difference between how behaviorists and structuralists
learned about the mind?
• What is classical conditioning?
"Similar tone experiment"
Does this relate to generalization or discrimination? If so, how?
• Step 1. The dog is conditioned to salivate in response to the tone C.
• Step 2. Generalization: the dog salivates to a range of musical tones above and below the
original tone, but it salivates less and less as the tone moves away from C.
• Step 3. The original tone C is repeatedly paired with food. Neighboring tones are also sounded,
but they are not followed by food. The dog is conditioned to discriminate.
Generalization and Discrimination

• Generalization: the tendency to make a conditioned response to a
stimulus that is similar to the original conditioned stimulus. The more
similar a new stimulus is to the original reinforced stimulus, the more
likely the same response is.

• Discrimination: the learned ability to distinguish between similar
stimuli so that the conditioned response occurs only to the original
conditioned stimulus and not to similar stimuli.
Watson on Emotional Conditioning
Experiment: "Little Albert and Peter"
Generalization and Transference

Watson's Experiment with Albert

• Albert's contact with any animal led to the emotional response of crying.

• Albert's conditioned emotional response of crying at the presentation of a rabbit,
which had not been paired with the loud sound, demonstrated Freud's concept of
transference.
Conditioned Emotional Reactions

• Through paired association, positive and negative reactions may be
conditioned to a variety of objects and events.
• Example: Parental disgust reactions when confronted with spiders
facilitated the children's acquisition of spider fear
(de Jong, Andrea, & Muris, 1997).
How are the
concepts of classical
conditioning relevant
in the classroom?
Children in Lusail School for Cool Kids are not reading enough. The principal has
given the teachers a goal to increase the number of books students read and
will hold a pizza party for the class with the most books read. Which teacher's
approach is better for creating lifelong readers, and why? Use the concept of
classical conditioning to answer.

Teacher A’s approach: require students to read a book a week, give students a
quiz on the pages the students should have read, if the student has not read,
the student loses play time.

Teacher B’s approach: the students have a comfortable reading corner with
bean bag chairs, calm colors and a selection of exciting books.
Remind Us
• What was the Baby Albert Experiment and why was it significant?
Classical Conditioning in the Classroom

For example, for some children, unfamiliar situations generate anxiety reactions. Introducing a
difficult activity, such as a mathematics activity, on the first day of school may lead to an
anxiety reaction becoming associated with mathematics or with school.

For example, sustained reading is an important activity in learning to appreciate literature.
Carpeting one corner of the room and furnishing it with large sofa cushions to create an area
for sustained reading may, over time, elicit positive reactions to the free-time reading
included in the daily schedule.
Implications for practice
Thorndike’s Puzzle Box
Thorndike’s work with animals

Thorndike concluded that:

Behavior changes because of its ___________


He called this the law of _______
Reinforcement
• Reinforcement is the process of increasing the future probability of
the most recent response.
• What kinds of reinforcement could lead a child to read more?
Operant Conditioning
• The kind of learning that Thorndike studied is called operant
conditioning (because the subject operates on the environment to
produce an outcome) or instrumental conditioning (because the
subject’s behavior is instrumental in producing the outcome).
• Operant or instrumental conditioning is the process of changing
behavior by providing a reinforcer after a response
Connectionism
• Edward Thorndike's connectionism is typically referred to as a
behaviorist theory
• It differs from classical conditioning in two major ways.
• First, Thorndike was interested in mental processes, and he
designed his first experiments to address the thought processes of
animals.
• Second, instead of reflex or involuntary reactions, Thorndike
researched voluntary or self-directed behaviors.
Connectionism
• The theory is known as connectionism because the animal establishes
connections between particular stimuli and self-initiated behaviors.
• During the series of trials in the experiment, the correct response was
gradually "stamped in" or strengthened.
• Incorrect responses were weakened or "stamped out."
• In other words, problem solving involves establishing associations or
connections between the stimulus (the problem) and appropriate
response
The Laws of Learning
• The law of effect states that a satisfying state of affairs following the
response strengthens the connection between the stimulus and the
appropriate behavior, and an annoying state weakens the connection.
• The law of exercise states that repetition of the experience increases the
chances of a correct response.
• The law of readiness describes the conditions that govern the
states referred to as "satisfying" or "annoying."
Think as an
educator…
What are your thoughts about
the law of exercise? When is it
applicable, and when is it not?
Revisions of the Law of
Exercise
Thorndike discarded the Law of Exercise when he found that simple
repetition of a situation does not necessarily “stamp in” responses
Educational Implications
• Facilitating transfer: Thorndike suggested that drilling students on a specific skill
does not help them master it, nor does it teach them how to apply the skill in
different contexts. When teachers instruct secondary students in how to use map
scales, they must also teach them to calculate miles from inches. Students
become more proficient if they actually apply the skill to various maps and
create maps of their own surroundings than if they are just given numerous
problems to solve.
• When elementary teachers begin working with students on liquid and dry
measurement, having the students use a recipe to actually measure ingredients
and create a food item is much more meaningful than using pictures, charts, or
just filling cups with water or sand.
• In medical school, having students actually observe and become involved in
various procedures or surgeries is much more meaningful than just reading
about the conditions in textbooks.
Homework
• Create a one page concept map or visual which compares structuralism,
behaviorism, functionalism, and connectionism

• Make sure to add the following terms:


• Classical conditioning
• Operant conditioning
• Thorndike
• Watson
• Pavlov
• James
• Wundt
How can we teach a rat to play
basketball?
Shaping
• Shaping is a procedure in which an experimenter successively
reinforces behaviors that lead up to or approximate the desired
behavior.
“Shaping: Facing the Bar”
Skinner: Operant Conditioning
and Learning
• Operant Conditioning: the type of learning in which the frequency of voluntary
behavior changes as a result of the consequences that the behavior produces.

• Skinner manipulated the consequences that occurred when animals carried out
certain behaviors.

• Reinforcer: anything that strengthens or increases the probability of the
response.
Shaping: Maze Experiment
• Skinner’s research on operant conditioning led him to conclude that
simply reinforcing small acts can condition complex forms of behavior.
Positive and Negative
Reinforcement
• Positive reinforcement refers to the presentation of a stimulus that
increases the probability that a behavior will occur again
• Adding something nice to increase the chance
• Negative reinforcement refers to an aversive (unpleasant) stimulus
whose removal increases the likelihood that the preceding response
will occur again
• Increase of behavior by subtracting something that is unpleasant
Examples: Positive or Negative Reinforcement

• "Rat learns to press a lever to obtain food pellets": Positive
• "University students study more often after getting an A": Positive
• "Rat learns to press a lever to turn off an annoying loud buzzer": Negative


Consequences and
Reinforcement
• Consequences are dependent on behavior
• Reinforcement is a consequence that occurs after a behavior
and increases the chance that the behavior will occur again
• Punishment is a consequence that occurs after a behavior and
decreases the chance that the behavior will occur again
Can there be positive and
negative punishment?
Punishment: Positive Vs.
Negative
• Positive Punishment – Decreasing behavior after the addition of a
consequence
1) Adding chores for a child who makes a mess
2) Adding homework for students who waste time

Give an example of negative punishment:

Taking away the tablet


Educational Implications
• What are 4 ways a teacher can deal with a child who speaks when the
teacher is speaking?

                 Positive        Negative
Reinforcement
Punishment
Educational Implications
• Young children often learn to operate on their environment (that is,
their parents) by crying. This seems particularly true at times when
children do not want to go to bed, even though they are very tired
and need sleep. What parental response is necessary to eliminate this
type of crying?
Student Reflection:…
Reinforcement and Punishment
Punishment does not greatly weaken a response when no other response is
available.

"Research on spanked versus timeout children"

• Positive punishment refers to presenting an aversive (unpleasant) stimulus
after a response. The aversive stimulus decreases the chances that the
response will recur.
• Negative punishment refers to removing a reinforcing stimulus (e.g., a child's
allowance) after a response. This removal decreases the chances that the
response will recur.
Primary and Secondary Reinforcers

• A primary reinforcer is a stimulus, such as food, water, or going out, that is
innately satisfying and requires no learning on the part of the subject to
become pleasurable.
• A secondary reinforcer is any stimulus that has acquired its reinforcing power
through experience; secondary reinforcers are learned, such as by being
paired with primary reinforcers or other secondary reinforcers.
Educational Implications

• A student learns that a good grade wins approval.
• An employee learns that increased productivity wins the employer's praise.

In these cases, secondary means "learned." It does not mean unimportant. We
spend most of our time working for secondary reinforcers.
Skinner's Experiment on Punishment

Skinner trained food-deprived rats to press a bar to
get food and then stopped reinforcing
their presses. For the first 10 minutes, some
rats not only failed to get food but also
received a slap on their paws every time
they pressed the bar (the bar slapped their
paws). The punished rats temporarily
suppressed their pressing, but in the long
run they pressed as many times as the
unpunished rats did.

What can we conclude?


Disadvantages of Punishment

• Punishment does not extinguish an undesirable behavior; it only suppresses it.
• Punishment only indicates which behaviors are unacceptable; it should be
administered together with reward for the desired behavior.
• Severe punishment leads to resentment, fear, and anger.
• Frequent punishment leads to aggressiveness.
Drawbacks to Punishment

• Punishment should be applied to the misbehavior immediately, interrupting the
behavioral problem; the longer the delay between the response and the
punishment, the less effective the punishment.
• The purpose of punishment is to modify the behavior, not to vent anger.
Increasing the intensity of punishment is not effective.
• Punishment must be consistent, not applied one day and withheld the next.
Schedules of Reinforcement

• Continuous reinforcement: reinforcing every desirable behavior with a
consequence.

"Skinner reinforcing a bar-pressing response with a food pellet"

Is this realistic in real life? Why or why not?
Partial Reinforcement
Reinforcement can be delivered at different rates and times, producing different
ratio and interval schedules:
• Partial Reinforcement – A pattern of reinforcement in which some
but not all correct responses are reinforced
• Research has shown that partial reinforcement leads to greater
resistance to extinction than does continuous reinforcement.
Fixed-Interval, Fixed-Ratio, Variable-Interval, Variable-Ratio
• Variable ratio: reinforcement occurs after a variable number of correct responses.
• Example: If you enter a lottery, each time you enter you have some chance of winning, but you cannot
predict how many times you must enter before winning (if ever).

• Fixed ratio: provides reinforcement only after a certain (fixed) number of correct responses, for example
after every sixth response.
• "After every 20 responses a reinforcement is given."

• Variable interval: reinforcement is available after a variable amount of time.
• Example: Checking your email or your Facebook account: a new message could appear at any
time, so you check occasionally.

• Fixed interval: provides reinforcement for the first response after a specific time interval.
• Example: Checking your grade on BB is behavior on a fixed-interval schedule: if you know the grades
are going to be posted at about 3 p.m., you might begin to check around 2:30 and continue checking
every few minutes until they appear. Likewise, if you are eagerly awaiting an important package, you
might check for it repeatedly around the expected delivery time. Showing up on time for class is
another example of a fixed-interval schedule.
Educational Implications
• "Gamblers (camel players)" often seem to be hooked on gambling. In
terms of schedules of reinforcement, why?
