Lecture 5: Collecting Evaluation Data
1
Collect data
[Diagram] Evaluation questions → Indicators: evidence that answers your questions → Sources of information: program records, individuals, public → Methods: who, what
2
Source of evaluation information
• Existing information
• People
• Pictorial records and observations
3
Quantitative: numbers, breadth, generalizability
Qualitative: words, depth, specificity
4
Paradigm dimensions
Qualitative:
• Naturalistic inquiry
• Holistic, system-wide perspective
• Uniqueness and diversity
• Inductive reasoning
• Qualitative data (words)
• Qualitative methods − unstructured, open-ended
• Purposeful sampling
• Emergent, flexible design
• Content analysis
• Extrapolations

Quantitative:
• Scientific/experimental design
• Independent, dependent variables
• Standardized, uniform
• Deductive reasoning
• Quantitative data (numbers)
• Quantitative methods − structured, standardized
• Probabilistic, random sampling
• Fixed, controlled design
• Statistical analysis
• Generalizations
5
Quantitative methods − Qualitative methods
[Graphic contrasting quantitative and qualitative methods]
6
Often, it is better to use more than one
method….
Mixed methods for one program
• Log of activities and participation
• Self-administered questionnaires
completed after each workshop
• In-depth interviews with key
informants
• Observation of workshops
• Survey of participants
7
Are the data reliable and valid?
8
“Trustworthy” and “credible” data
9
Common data collection methods
• Survey
• Case study
• Interview
• Observation
• Group assessment
• Expert or peer reviews
• Portfolio reviews
• Testimonials
• Tests
• Photographs, videotapes, slides
• Diaries, journals, logs
• Document review and analysis
10
When choosing methods, consider…
12
Quality criteria for methods
UTILITY
Will the data sources and collection
methods serve the information needs
of your primary users?
13
Quality criteria…
FEASIBILITY
Are your sources and methods
practical and efficient?
Do you have the capacity, time, and
resources?
Are your methods non-intrusive and
non-disruptive?
14
Quality criteria…
PROPRIETY
Are your methods respectful, legal,
ethical, and appropriate?
Does your approach protect and
respect the welfare of all those
involved or affected?
15
Quality criteria…
ACCURACY
Are your methods technically adequate to:
• answer your questions?
• measure what you intend to measure?
• reveal credible and trustworthy
information?
• convey important information?
16
There is no one right method of
collecting data.
18
Is a written questionnaire culturally appropriate?
Things to consider:
• Literacy level
• Tradition of reading, writing
• Setting
• Not best choice for people with oral tradition
• Translation (more than just literal translation)
• How cultural traits affect response – response sets
• How to sequence the questions
• Pretest questionnaire may be viewed as intrusive
19
Are interviews culturally appropriate?
Things to consider:
• Preferred by people with
an oral culture
• Language level proficiency;
verbal skill proficiency
• Politeness – responding to authority (thinking it’s
unacceptable to say “no”), nodding, smiling,
agreeing
• Need to have someone present
• Relationship/position of interviewer
• May be seen as interrogation
• Direct questioning may be seen as impolite,
threatening, or confrontational
20
Are focus groups culturally appropriate?
Things to consider:
• Issues of gender, age, class, clan differences
• Issues of pride, privacy, self-sufficiency, and
traditions
• Relationship to facilitator as prerequisite to
rapport
• Same considerations as for interview
21
Is observation culturally appropriate?
Things to consider:
• Discomfort, threat of being observed
• Issue of being an “outsider”
• Observer effect
• Possibilities for
misinterpretations
22
Cultural issues related to use of existing
data/records
• Need careful translation of documents in
another language
• May have been written/compiled using
unknown standards or levels of aggregation
• May be difficult to get authorization to use
• Difficult to correct document errors if low
literacy level
23
Culturally appropriate informed consent
24
Focus groups
25
Focus groups
26
Survey
27
Survey
28
Steps in planning a survey
29
Response rate
30
Response rate
response rate = # that answered ÷ # you contacted
31
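The arithmetic above can be scripted directly. Below is a minimal sketch in Python; the function name and the figures (150 completed out of 250 contacted) are hypothetical.

```python
def response_rate(num_answered: int, num_contacted: int) -> float:
    """Response rate = number who answered / number you contacted."""
    if num_contacted == 0:
        raise ValueError("You must have contacted at least one person.")
    return num_answered / num_contacted

# Hypothetical example: 150 completed questionnaires out of 250 people contacted.
rate = response_rate(150, 250)
print(f"Response rate: {rate:.0%}")  # prints "Response rate: 60%"
```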
Response rate
32
Response rate
• There is no standard response rate. "The higher, the better." Anything under 60% is a warning.
• Why is high return important? It's the only way to know if results are representative.
• Address low response. How are people who didn't respond different from those who did? Only describe your results in terms of who did respond.
33
How to increase response rate
• Generate positive publicity for your survey.
• Oversample.
• Ensure that respondents see the value of
participating.
• Use a combination of methods.
• Make (multiple) follow-up contacts.
• Provide incentives.
• Provide 1st class postage/return postage.
• Set return deadlines.
• Make the survey easy to complete.
34
If response rate is low…
36
Document review − Sources
National Center for Education Statistics
http://nces.ed.gov
Census Bureau http://www.census.gov
Bureau of Labor Statistics http://stats.bls.gov
National Center for Health Statistics
http://cdc.gov/nchs/
Children’s Defense Fund
http://www.childrensdefense.org
Wisconsin Department of Public Instruction
Local school districts
ERIC searches http://www.eric.ed.gov/
County government
37
Document review –
Advantages of using existing data
38
Document review −
Issues in using existing data
40
Observation is used…
41
When is observation useful?
42
Observations
Advantages:
– Most direct measure of behavior
– Provides direct information
– Easy to complete, saves time
– Can be used in natural or experimental settings
Disadvantages:
– May require training
– Observer's presence may create artificial situation
– Potential for bias
– Potential to overlook meaningful aspects
– Potential for misinterpretation
– Difficult to analyze
43
Observation – Purpose, benefits
• Unobtrusive
• Can see things in their natural context
• Can see things that may escape conscious
awareness, things that are not seen by others
• Can discover things no one else has ever really paid attention to, things that are taken for granted
• Can learn about things people may be unwilling to talk
about
• Inconspicuous – least potential for generating
observer effects
• Least intrusive of all methods
• Can be totally creative – has flexibility to yield insight
into new realities or new ways of looking at old
realities
44
Observation – Limitations
1. Potential for bias
• Effect of culture on what you observe and
interpret
2. Reliability
• Ease of categorization
45
Observation – Ethical issues
• Unobtrusiveness is its greatest
strength; also potential for abuse in
invasion of privacy
• Can venture into places and gather
data almost anywhere
• Covert – overt
– Always consider ethics and human
subjects protection.
46
Types of observation
Structured: looking for
Unstructured: looking at
47
Steps in planning for observation
• Determine who/what will be observed.
• Determine aspects that will be observed
(characteristics, attributes, behaviors, etc.).
• Determine where and when observations will be
made.
• Develop the observation record sheet.
• Pilot test the observation record sheet.
• Train the observers and have them practice.
• Collect the information.
• Analyze and interpret the collected information.
• Write up and use your findings.
48
Who/what to observe
• People (individuals, groups,
communities)
– Characteristics
– Interactions
– Behaviors
– Reactions
• Physical settings
• Environmental features
• Products/physical artifacts
49
Observation – Example
If you want information about… / You would record…
• Who uses a particular service: total number of users broken down by gender, age, ethnicity, etc.
• Interactions between youth and adults: # and types of questions asked by each
• Neighborhood safety: ???
50
What to observe − Example
Exhibit on tobacco use at a county fair
Information needed:
Number of youth who visit the exhibit: age, gender,
cultural background
51
Example – Plans for observing
participation in an after school program
• Who: youth attending the program
• What:
– approximate age
– gender, cultural background
– length of time student stays in the program
• When: all hours the program is open for
one week each month during 2007
52
Recording your observations
Observations need to be recorded to
be credible. You might use:
– Observation guide
– Recording sheet
– Checklist
– Field note
– Picture
– Combination of the above
53
Observational rating scales
55
Training observers
56
Practice
57
Practice
58
Interviewing is…
59
Interviews are useful…
60
Interviews
61
Interviews
Advantages:
– deep and free response
– flexible, adaptable
– glimpse into respondent's tone, gestures
– ability to probe, follow up
Disadvantages:
– costly in time and personnel
– requires skill
– may be difficult to summarize responses
– possible biases: interviewer, respondent, situation
62
Types of interviewing
Structured Conversational
63
Type: Structured interview
64
Type: Guided interview
65
Type: Conversational interview
67
Interviewing tips
• Keep language pitched to that of
respondent
• Avoid long questions
• Create comfort
• Establish time frame for interview
• Avoid leading questions
• Sequence topics
• Be respectful
• Listen carefully
68
Recording responses
69
Questionnaires are…
• Data collection instruments used to collect
standardized information that can be
expressed numerically or through short
answers
• Basic instruments of surveys and structured
interviews
• Appropriate when…
– you want information from many people
– you have some understanding of the situation
and can ask meaningful questions
– information is sensitive or private − anonymous
questionnaires may reduce bias
70
Questionnaires
Advantages:
– can reach large numbers
– provide for anonymity
– relatively inexpensive
– easy to analyze
Disadvantages:
– might not get careful feedback
– wording can bias client's response
– response rate is often low
– literacy demands
71
When should a questionnaire be used?
• Respondents can provide useful
information about the topic.
• You know what it is you want to know and
are reasonably sure that you can ask
standardized questions to get the
information.
• Respondents can be relied upon to provide
the information you need (perhaps with
incentives). This means they can
comprehend the questions and respond
properly, they are truthful, and they are
motivated enough to respond carefully.
72
Good questionnaires are NOT EASY!
73
Questionnaire design − Considerations
• Kind of information: What do you want to
know? Is the information already
available?
• Wording of questions and responses
• Formatting the questionnaire
• Pre-testing
• Cover letters and introductions
• When/where will the questionnaire be
distributed?
• How will returns be managed? How will
the data be analyzed?
• Who is responsible for each task?
74
Questionnaire design
75
Questionnaire design
• Write questions through your
respondent’s eyes.
– Will the question be seen as
reasonable?
– Will it infringe on the respondent’s
privacy?
– Will the respondent be able and willing
to answer the question?
• Be selective and realistic when
writing questions.
76
6 STEPS IN DEVELOPING EFFECTIVE
QUESTIONNAIRES
1. Decide what information you need.
2. Determine sample – respondents.
3. Develop accurate, user-friendly
questionnaire.
4. Develop plan for distribution, return,
and follow-up.
5. Provide clear instructions and a
good cover letter.
6. Pilot test.
77
Step 1: What information is needed?
• Be specific
• Need to know vs. would like to know
• Check to see if information exists
elsewhere
• What do you want to be able to say:
counts, percentages, relationships,
narratives
78
Step 2: Sample
79
Step 3: Develop questionnaire
80
Step 3 continued
81
Step 4: Plan distribution, return, follow-up
82
Step 5: Cover Letter − Explanation
• Purpose of questionnaire –
how information will be used
• Why they are being asked to fill it out
• Importance of their response
• How and when to respond
• Whether response will be anonymous or
confidential
• Your appreciation
• Promise results, if appropriate
• Signature − sponsorship
83
Step 6: Pilot test
• Always
• With people as similar to respondents
as possible
– Do they understand the questions? The
instructions?
– Do questions mean same thing to all?
– Do questions elicit the information you
want?
– How long does it take?
• Revise as necessary
84
Kinds of information –
What do you want to know?
• Knowledge − what people know,
how well they understand something
• Beliefs − attitudes, opinions
• Behaviors − what people do
• Attributes/Demographics − what
people are and what people have
85
Change in knowledge
Impact of divorce on children
As a result of this program, to what extent do you
understand the following about children and divorce:
b. Self-blame or guilt                 1  2  3  4
c. The desire for parents to reunite   1  2  3  4
86
Change in skills
Communication skills
List three communications techniques
you learned in this course that you
have used with your children:
1.
2.
3.
87
Change in attitude
88
Change in behavior
How visitation disputes are handled
89
Attributes –
What people are, what people have
90
Types of questions
91
Open-ended questions
92
Open-ended questions
Pros:
• Can get unintended or unanticipated results
• Wide variety of answers
• Answers in participants' "voices"
Cons:
• More difficult to answer
• May be harder to categorize for interpretation
• More difficult for people who don't write much
93
Open-ended questions
Examples:
94
Closed-ended questions
95
Closed-ended questions
Pros:
• Easy to analyze responses
• Stimulates recall
Cons:
• Chance of none of the choices being appropriate
• Biases response to what you're looking for
• Misses unintended outcomes
96
Closed-ended questions
Example − one best answer:
97
Closed-ended questions
Example − multiple responses:
Of the communication skills taught in this
workshop, which will you use with your
children? (Check all that apply.)
active listening
acknowledge feelings
ask more open-ended questions
provide one-on-one time for discussion
negotiation
other
98
Closed-ended questions
Example − rating scale
To what extent do you agree or
disagree with the new zoning code?
(Circle one.)
1 Strongly disagree
2 Mildly disagree
3 Neither agree nor disagree
4 Mildly agree
5 Strongly agree
99
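Responses to a closed-ended scale item like the one above are easy to tabulate. Here is a minimal Python sketch; the twelve responses are hypothetical and the labels mirror the example scale.

```python
from collections import Counter

# Hypothetical responses to the 5-point zoning item above
# (1 = Strongly disagree ... 5 = Strongly agree).
responses = [5, 4, 4, 3, 2, 5, 1, 4, 3, 5, 2, 4]

labels = {
    1: "Strongly disagree",
    2: "Mildly disagree",
    3: "Neither agree nor disagree",
    4: "Mildly agree",
    5: "Strongly agree",
}

counts = Counter(responses)
total = len(responses)
for value in sorted(labels):
    n = counts.get(value, 0)
    print(f"{labels[value]:<27} {n:2d}  ({n / total:.0%})")

# A mean can summarize the scale, though the full distribution is usually more informative.
print(f"Mean rating: {sum(responses) / total:.2f}")
```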
When wording the questions, consider…
100
Use clear, specific, simple wording.
101
Example:
Use clear, specific, simple wording.
102
Include all necessary information.
103
Example: Vague questions
Vague:
How will this seminar help you?
104
Example: Vague questions
Better:
What skills did you learn in this
seminar that will help you follow the
child custody arrangements set by
the court?
how to negotiate changes with my ex-spouse
how to explain visitation arrangements to my
children
steps to requesting a change in arrangements
from the court
how to separate child support from visitation
disputes
105
Example: Avoid ambiguous words or
phrases.
Ambiguous:
How has your child demonstrated
improved communication skills
since participating in “Let’s
Communicate”?
106
Example: Avoid specificity that limits the
potential for reliable responses.
Too specific:
How many meals have you eaten as a
family during the past year?
number of meals
107
Example: Avoid making assumptions.
108
Avoid leading questions.
• Biased questions
– Influence people to respond in a certain
way
– Make assumptions about the respondent
– Use language that has strong positive or
negative appeal
109
Example: Leading questions
Leading:
Do you think this seminar will help
you stop fighting with your spouse
about the children?
Better:
How do you think this seminar will
help you work with your spouse to
address your children’s concerns?
110
Avoid double-barreled questions.
111
Example: Double-barreled question
Double:
How will this seminar help you communicate
better with your children and their
grandparents about your divorce?
Better:
How will this seminar help you communicate
with your children about your divorce?
How will this seminar help you communicate
with your children’s grandparents about their
relationship with their grandchildren?
112
Make the response categories clear,
logical, and mutually exclusive.
113
Example: Clear, logical, and mutually
exclusive responses
Poor spacing and logic:
Children's Ages
0−1
1−3
3−6
7−12
13−18

Better spacing, logic, and mutually exclusive:
Children's Ages
under 1 year of age
1−3 years of age
4−6 years of age
7−9 years of age
10−12 years of age
13−15 years of age
16−18 years of age
(See the sketch below.)
114
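The revised ranges work because every age maps to exactly one category; with the original overlapping ranges, ages 1 and 3 would each fit two categories. A minimal Python sketch with hypothetical ages makes the point.

```python
def age_category(age: int) -> str:
    """Assign a child's age to exactly one of the mutually exclusive categories above."""
    if age < 1:
        return "under 1 year of age"
    elif age <= 3:
        return "1-3 years of age"
    elif age <= 6:
        return "4-6 years of age"
    elif age <= 9:
        return "7-9 years of age"
    elif age <= 12:
        return "10-12 years of age"
    elif age <= 15:
        return "13-15 years of age"
    elif age <= 18:
        return "16-18 years of age"
    return "over 18 (outside the example scale)"

# Hypothetical ages; each lands in exactly one bin.
for age in [0, 1, 3, 6, 7, 13, 18]:
    print(age, "->", age_category(age))
```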
Example: Vague quantifier
Vague:
How often did you attend an
Extension-sponsored workshop
during the past year?
a. Never
b. Rarely
c. Several times
d. Many times
115
Example: Vague quantifier
Better:
How often did you attend an
Extension-sponsored workshop
during the past year?
a. Not at all
b. One to two times
c. Three to five times
d. More than five times
116
Rating scales
• Ordered options to gauge difference of
opinion.
• Keep the order of choices the same
throughout the form.
• Odd number of options allows people
to select a middle option.
• Even number forces respondents to
take sides.
• Simpler is better.
117
Types of rating scales
Category scales
Numeric scales
Semantic differentials
118
Category/Rating scales
119
Category/Rating scales
• Balance the scale with an equal
number of positive and negative
options.
• “No opinion” or “uncertain” are not
part of a scale. They are usually
placed off to the side or in a separate
column.
• All choices should refer to the same
thing/concept.
120
Category/Rating scales − Example
Poor:
Not worth my time
Slightly interested
Moderately interested
Very interested

Better:
Not at all interested
Slightly interested
Moderately interested
Very interested
121
Rating scales − Words
122
Rating scales − Words
123
Rating scales − Words
124
Formatting considerations
• Overall appearance
• Length of the questionnaire
• Order of questions
• Demographic data collection
125
Formatting − Overall appearance
126
Formatting − Length of the questionnaire
127
Formatting − Order of questions
• Introduction − Include questionnaire’s
sponsor, purpose, use, confidentiality, etc.
• Include instructions for how to answer the
questions (e.g., Circle one; Check all that
apply).
• Arrange questions so they flow naturally.
• Place demographic questions at the end of
the questionnaire.
• Be consistent with numbers, format, and
scales.
128
Formatting − Order of questions
129
Formatting − Demographic data collection
131
Pre-test the questionnaire
ALWAYS
ALWAYS
ALWAYS
132
What you want to find out in a pretest:
• Does each question measure what it is
supposed to measure?
• Are all the words understood?
• Are questions interpreted in the same
way by all respondents?
• Are all response options appropriate?
• Is there an answer that applies to each
respondent?
-Salant and Dillman (1994)
Source: Salant, P., & Dillman, D. A. (1994). How to conduct your own survey. New York: John Wiley & Sons, Inc.
133
Pre-testing questions
• Are the answers respondents can choose
from correct? Are some responses
missing?
• Does the questionnaire create a positive
impression – does it motivate people to
answer it?
• Does any aspect of the questionnaire
suggest bias?
• Do respondents follow the directions?
• Is the cover letter clear?
-Salant and Dillman (1994)
Source: Salant, P., & Dillman, D. A. (1994). How to conduct your own survey. New York: John Wiley & Sons, Inc.
134
Pre-testing steps
1. Select reviewers who are similar to
the respondents and who will be
critical. (Also ask your colleagues to
review it.)
2. Ask them to complete the
questionnaire as if it were “for real.”
3. Obtain feedback on the form and
content of the questionnaire and the
cover letter. Was anything confusing,
difficult to answer, de-motivating?
135
Pre-testing steps, continued
136
Revise and revise…
137
Choices: Timing of data collection
138
A good cover letter will include
information about…
• Purpose and importance of the
survey
• Survey sponsor − use letterhead
• Why the respondent was selected to
participate
• Benefit(s) of completing survey
• Assurance of anonymity or
confidentiality
139
A good cover letter will include
information about…
140
Cover letters − Tips
141
Cover letters – Pre-test
142
Sampling – The basics
• Why sample?
• What are options for sampling
design?
• What determines sample size?
• What should you consider when
conducting the sample?
143
Why a sample?
• Save money.
• Save time.
• Minimize error and maximize
representation.
144
Types of sampling strategies:
Probability:
• Why? Generalize to population.
• Some examples:
– Simple random sample
– Stratified sample
– Cluster sample
– Systematic sample

Nonprobability:
• Why? Generalizability not as important. Want to focus on "right cases."
• Some examples:
– Quota sample
– "Purposeful" sample
– "Convenience" or "opportunity" sample
(See the sampling sketch below.)
145
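To make two of the probability designs above concrete, here is a minimal Python sketch; the participant list, counties (used as strata), and sample sizes are all hypothetical.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical sampling frame: 200 participants, each tagged with a county (the stratum).
population = [{"id": i, "county": random.choice(["Dane", "Rock", "Sauk"])} for i in range(200)]

# Simple random sample: every unit has an equal chance of being chosen.
simple_sample = random.sample(population, k=30)

# Stratified sample: draw separately from each county, in proportion to its size,
# so every stratum is represented.
stratified_sample = []
for county in ["Dane", "Rock", "Sauk"]:
    stratum = [p for p in population if p["county"] == county]
    if stratum:
        k = max(1, round(30 * len(stratum) / len(population)))
        stratified_sample.extend(random.sample(stratum, k=k))

print(len(simple_sample), "drawn by simple random sampling")
print(len(stratified_sample), "drawn by proportional stratified sampling")
```

The simple random draw satisfies the equal-chance rule noted on the next slide; the stratified draw trades some of that simplicity for guaranteed representation of each county.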
Things to remember
• The smaller the sample size, the greater the
variability.
• Goal of sampling: to reduce variability enough to say
it represents the population without increasing costs.
• Sampling bias: sampling is not "really" random. In a good sample:
1. Each unit should have an equal chance of
being chosen.
2. Choosing one unit should not affect whether
another is chosen.
• Response bias: Respondents with particular
characteristics tend to respond in particular ways.
146