This document provides guidance on designing program evaluations. It discusses clarifying the program's goals and strategy, developing relevant evaluation questions, and selecting an appropriate evaluation design and approach. It also covers identifying appropriate data sources and collection procedures, developing plans to analyze data so that valid conclusions can be drawn, and defining the key parts of an evaluation plan, such as objectives, information sources, data collection methods, and analysis plans.
Planning and Designing Evaluation
1. Group members:
Devi Oktaviani 10004290
Karia Gawantri 10004362
Marlin Dwinastiti 10004402
Vinita Ajeng S 10004405
Rosi Diana Indah M 10004416
2. • Designing Evaluations is a guide to successfully completing evaluation design tasks.
• Designing Evaluations is one of a series of papers whose purpose is to provide guides to various aspects of audit and evaluation methodology and to indicate where more detailed information is available.
3. • A program evaluation is a systematic study using research methods to collect and analyze data to assess how well a program is working and why.
• Some evaluations attempt to isolate the causal impacts of programs from other influences on outcomes, whereas performance measurement typically does not.
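To make the idea of isolating causal impact concrete, here is a minimal sketch in Python, not taken from the original slides: outcomes for program participants are compared with a similar non-participant group, so change caused by the program can be separated from change that would have happened anyway. All group names and outcome values below are hypothetical.

# Hypothetical comparison-group (difference-in-differences) illustration.
# All outcome values are invented for this example.
program_group = {"before": 52.0, "after": 61.0}     # participants' average outcome
comparison_group = {"before": 50.0, "after": 54.0}  # similar non-participants

program_change = program_group["after"] - program_group["before"]
comparison_change = comparison_group["after"] - comparison_group["before"]

# The comparison group's change approximates what would have happened
# without the program, so the difference of the two changes is the
# estimated program impact.
estimated_impact = program_change - comparison_change

print(f"Program group change:     {program_change:+.1f}")
print(f"Comparison group change:  {comparison_change:+.1f}")
print(f"Estimated program impact: {estimated_impact:+.1f}")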
4. • In contrast, evaluations conducted to provide an independent assessment of a program’s strengths and weaknesses should be conducted by a team independent of program management. Evaluations purchased by agencies from professional evaluation firms can often be considered independent.
5. 1. Clarify understanding of the program’s goals
and strategy.
2. Develop relevant and useful evaluation
questions.
3. Select an appropriate evaluation approach or
design for each evaluation question.
4. Identify data sources and collection
procedures to obtain relevant, credible
information.
5. Develop plans to analyze the data in ways that allow valid conclusions to be drawn about the evaluation questions.
6. 1. the evaluation questions, objectives, and
scope;
2. information sources and measures, or what
information is needed;
3. data collection methods, including any
sampling procedures, or how information or
evidence will be obtained;
4. an analysis plan, including evaluative
criteria or comparisons, or how or on what
basis program performance will be judged
or evaluated;
5. an assessment of study limitations (a minimal sketch of such a plan follows this list).
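As a rough illustration only, not part of the original guide, the five parts above could be recorded as a simple data structure so that the plan for each evaluation question is kept in one place. The field names and example values below are hypothetical.

from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    """One record per evaluation question, covering the five parts listed above."""
    question: str                  # 1. evaluation question, objective, and scope
    information_sources: list      # 2. where the needed information will come from
    collection_methods: list       # 3. how evidence will be obtained, including sampling
    analysis_plan: str             # 4. criteria or comparisons used to judge performance
    limitations: list = field(default_factory=list)  # 5. known study limitations

# Hypothetical example entry.
plan = EvaluationPlan(
    question="Is the program reaching its intended participants?",
    information_sources=["program enrollment records", "participant survey"],
    collection_methods=["records review", "random-sample survey of participants"],
    analysis_plan="Compare enrollee characteristics with the eligible population.",
    limitations=["survey non-response may bias results"],
)
print(plan.question)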
7. • Designing an evaluation plan is iterative: evaluation objectives, scope, and methodology are defined together because what determines them often overlaps.
8. • Implementation evaluations are very similar to performance monitoring in assessing the quality and efficiency of program operations, service delivery, and service use, except that they are conducted as separate projects, not integrated into the program’s daily routine.
• Implementation evaluations may be conducted to provide feedback to program managers, accountability to program sponsors and the public, or insight into variation in program outcomes.
9. 1. Focusing the Evaluation
2. Collecting the Information
3. Using the Information
4. Managing the Evaluation
10. a. Kind of information to be acquired
b. Sources of information (for example, types of
respondents)
c. Methods to be used for sampling sources (for example, random sampling; a minimal sketch follows this list)
d. Methods of collecting information
e. Timing and frequency of information
collection
f. Analysis plan
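The random sampling mentioned in item (c) can be illustrated with a minimal Python sketch, not drawn from the original slides: a simple random sample of respondents is taken without replacement from a sampling frame. The respondent list and sample size are hypothetical.

import random

# Hypothetical sampling frame: identifiers for all potential respondents.
sampling_frame = [f"respondent_{i:03d}" for i in range(1, 201)]  # 200 people

random.seed(42)   # fixed seed so the draw can be reproduced
sample_size = 30
selected = random.sample(sampling_frame, sample_size)  # simple random sample, no replacement

print(f"Selected {len(selected)} of {len(sampling_frame)} respondents")
print(selected[:5])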