M & E Test
MONITORING vs. MID-TERM OR FINAL EVALUATION

Monitoring: Provides information enabling management staff to assess implementation progress and make timely decisions.
Evaluation: Relies on more detailed data (e.g., from surveys or studies) in addition to that collected through the monitoring system to understand the project in greater depth.

Monitoring: Is concerned with verifying that project activities are being undertaken, services are being delivered, and the project is leading to the desired behavior changes described in the project proposal.
Evaluation: Assesses higher level outcomes and impact and may verify some of the findings from the monitoring. Evaluations should explore both anticipated and unanticipated results.

Monitoring: Is an internal project activity.
Evaluation: Can be externally led (particularly end-of-project evaluations), though they should involve the active participation of project staff.

Monitoring: Is an essential part of good day-to-day management practice.
Evaluation: Is an essential activity in a longer-term dynamic learning process.

Monitoring: Is an essential part of day-to-day management and must be integrated within the project management structure.
Evaluation: Is important for making decisions on overall project direction.

Monitoring: Takes place during the implementation phase.
Evaluation: Occurs at predetermined points during implementation. Other smaller evaluations may be undertaken to meet specific information needs throughout the process.

Monitoring: Generally focuses on the question "Are we doing things right?"
Evaluation: Generally focuses on the question "Are we doing the right thing?"
Monitoring Question Examples paired with Evaluation Question Examples

Monitoring: How many people or communities were reached or served? Were the targeted numbers reached?
Evaluation: How adequate was program reach? Did we reach enough people? Did we reach the right people?

Process (Design & Implementation)
Monitoring: How was the program implemented? Was implementation in accordance with design and specifications?
Evaluation: How well was the program implemented? Fairly, ethically, legally, culturally appropriately, professionally, efficiently? For outreach, did we use the best avenues and methods we could have? How well did we access hard-to-reach and vulnerable populations? Did we reach those with the greatest need? Who missed out, and was that fair, ethical, just?

Monitoring: What has changed since (and as a result of) program implementation? How much have outcomes changed relative to targets?
Evaluation: How substantial and valuable were the outcomes? How well did they meet the most important needs and help realize the most important aspirations? Should they be considered truly impressive, mediocre, or unacceptably weak? Were they not just statistically significant, but educationally, socially, economically, and practically significant? Did they make a real difference in people's lives?

Evaluation: Were the outcomes worth achieving given the effort and investment put into obtaining them?
2. Determine Monitoring and Evaluation Questions, Indicators, and Their Feasibility
In this step, monitoring and evaluation specialists and program managers identify the most important evaluation questions, which should link directly to the stated goals and objectives. Questions should come from all stakeholders, including the program managers, donors, and members of the target populations. The questions should address each group's concerns, focusing on these areas: What do we want to know at the end of this program? and What do we expect to change by the end of this program? Framing and prioritizing monitoring and evaluation questions is sometimes difficult, especially when resources, time, and expertise are limited and multiple stakeholders are present. Monitoring and evaluation questions may require revision later in the plan development process.
3. Determine Monitoring and Evaluation Methodology: Monitoring the Process and Evaluating the Effects
This step should include the monitoring and evaluation methods, data collection methods and tools, analysis plan, and an overall timeline. It is crucial to clearly spell out how data will be collected to answer the monitoring and evaluation questions. The planning team determines the appropriate monitoring and evaluation methods, outcome measures or indicators, information needs, and the methods by which the data will be gathered and analyzed. A plan must be developed to collect and process data and to maintain an accessible data system. The plan should address the following issues:
- What information needs to be monitored?
- How will the information be collected? How will it be recorded? How will it be reported to the central office?
- What tools (forms) will be needed?
- For issues that require more sophisticated data collection, what study design will be used?
- Will the data be qualitative, quantitative, or a combination of the two?
- Which outcomes will be measured?
- How will the data be analyzed and disseminated?
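As a rough illustration only (the handout does not prescribe any particular tool), the data-collection part of the plan can be kept as one structured record per monitoring question, which makes gaps such as a question with no assigned collection method easy to spot. All field names and the example entry below are hypothetical.

```python
# Illustrative sketch: one record per monitoring question in the data-collection plan.
# Field names and example content are hypothetical, not prescribed by the handout.
from dataclasses import dataclass

@dataclass
class DataCollectionItem:
    question: str          # what information needs to be monitored?
    indicator: str         # outcome measure or indicator
    method: str            # how the information will be collected (e.g., register, survey)
    tool: str              # form or tool used to record it
    frequency: str         # how often it is collected and reported to the central office
    data_type: str         # "qualitative", "quantitative", or "mixed"
    responsible: str = ""  # left blank here; assigned in the next step

plan = [
    DataCollectionItem(
        question="How many people were reached by outreach sessions?",
        indicator="Number of people attending outreach sessions",
        method="Session attendance register",
        tool="Attendance form",
        frequency="Monthly",
        data_type="quantitative",
    ),
]

# Simple completeness check: flag entries still missing a method, tool, or frequency.
for item in plan:
    missing = [f for f in ("method", "tool", "frequency") if not getattr(item, f)]
    if missing:
        print(f"Incomplete plan entry for: {item.question} (missing: {', '.join(missing)})")
```

Keeping the plan in a structured form like this is optional; the same information can equally be held in a spreadsheet, as long as every monitoring question has a method, tool, and reporting frequency attached.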
4. Resolve Implementation Issues: Who Will Conduct Monitoring and Evaluation? How Will Existing Monitoring and Evaluation Results and Past Findings Be Used?
Once the data collection methods are established, it is important to clearly state who will be responsible for each activity. Program managers must decide: How are we really going to implement this plan? Who will report the process data, and who will collect and analyze it? Who will oversee any quantitative data collection, and who will be responsible for its analysis? Clearly, the success of the plan depends on the technical capacity of program staff to carry out monitoring and evaluation activities. This invariably requires technical assistance. Monitoring and evaluation specialists might be found in planning and evaluation units of the National AIDS Commission, National AIDS and STD Control Program, Census Bureau, Office of Statistics, multisectoral government ministries including the Ministry of Health, academic institutions, non-governmental organizations, and private consulting firms. It is important to identify existing data sources and other monitoring and evaluation activities, whether they have been done in the past, are ongoing, or have been sponsored by other donors. At this step, country office monitoring and evaluation specialists should determine whether other groups are planning similar evaluations and, if so, invite them to collaborate.
6. Develop the Monitoring and Evaluation Work Plan Matrix and Timeline
The matrix provides a format for presenting the inputs, outputs, outcomes, and impacts, and their corresponding activities, for each program objective. It summarizes the overall monitoring and evaluation plan by including a list of methods to be used in collecting the data. (An example of a matrix is presented in the appendix.) The timeline shows when each activity in the Monitoring and Evaluation Work Plan will take place.
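A minimal sketch of how such a matrix could be laid out as a simple file, with column headings drawn from the description above; the single example row, the file name, and the exact columns are illustrative assumptions, not the appendix format itself.

```python
# Illustrative sketch of a Monitoring and Evaluation Work Plan matrix written to CSV.
# Columns follow the description above; the example row and file name are hypothetical.
import csv

columns = [
    "objective", "activity", "input", "output", "outcome", "impact",
    "data_collection_method", "responsible", "timing",
]

rows = [
    {
        "objective": "Increase community awareness of HIV prevention",
        "activity": "Peer-education outreach sessions",
        "input": "Trained peer educators, information materials",
        "output": "Number of sessions held; number of people reached",
        "outcome": "Increased knowledge of prevention methods",
        "impact": "Reduced HIV incidence in target communities",
        "data_collection_method": "Session registers; annual knowledge survey",
        "responsible": "Field coordinator",
        "timing": "Registers monthly; survey in year 2",
    },
]

with open("me_work_plan_matrix.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=columns)
    writer.writeheader()
    writer.writerows(rows)
```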
If you have limited funds for evaluation, you may have to prioritise your evaluation by identifying the most important people to report to. Download the M&E Audience and Evaluation Questions Template.
Is the theory of change reflected in the program logic? How can the program logic inform the research questions?
Evaluation question examples and the criterion each addresses:
- Does the workshop topic and contents meet the information needs of the target group?
- To what extent is the intervention goal in line with the needs and priorities of the community?
- Did the engagement method used in this project lead to similar numbers of participants as previous or other programs at a comparable or lesser cost? (Efficiency)
- Have the more expensive engagement approaches led to better results than the less expensive engagement approaches? (Efficiency)
- To what extent did the workshops lead to increased community support for action to tackle climate change? (Effectiveness)
- To what extent did the engagement method encourage the target group to take part in the project? (Effectiveness)
- To what extent has the project led to more sustainable behaviours in the target group? (Outcome)
- Were there any other unintended positive or negative outcomes from the project? (Outcome)
- To what extent has the project led to long-term behaviour change? (Sustainability)
If you have developed a program logic, you can use this to start identifying relevant monitoring questions and indicators. Click here to see how. (LINK TO PPT IDENTIFYING MONITORING QUESTIONS FROM PROGRAM LOGIC) Once you have identified monitoring questions in your program logic, you can transfer them into your M&E Plan Template. Click here to see how. (LINK TO PPT FROM PROGRAM LOGIC TO M&E)
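For illustration only (the linked slides remain the intended guide), one way to think about this step is to walk the program logic level by level and draft a candidate monitoring question for each element, which can then be transferred into the M&E Plan Template and paired with an indicator. The program-logic elements and question wording below are hypothetical examples.

```python
# Illustrative sketch: drafting candidate monitoring questions from a program logic.
# The program-logic elements and question templates are hypothetical examples.

program_logic = {
    "inputs": ["Workshop facilitators", "Venue and materials"],
    "activities": ["Run four community workshops"],
    "outputs": ["Number of participants attending workshops"],
    "short_term_outcomes": ["Increased community support for climate action"],
    "long_term_outcomes": ["Sustained behaviour change in the target group"],
}

templates = {
    "inputs": "Were the planned resources ({item}) available when needed?",
    "activities": "Was the activity ({item}) delivered as planned?",
    "outputs": "What was achieved for: {item}?",
    "short_term_outcomes": "To what extent was '{item}' achieved?",
    "long_term_outcomes": "To what extent was '{item}' achieved, and is it likely to last?",
}

for level, items in program_logic.items():
    for item in items:
        print(f"[{level}] Monitoring question: {templates[level].format(item=item)}")
```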
Step 6. Identify who will evaluate the data, how it will be reported, and when
This step is optional but highly recommended, as it will round off the M&E plan as a complete document. Remembering that evaluation is the subjective assessment of a project's worth, it is important to identify who will be making this subjective assessment. In most cases, it will be the project team, but in some cases you may involve other stakeholders, including the target group or participants. You may also consider outsourcing a particular part of the evaluation to an external or independent party. For an evaluation to be used (and therefore useful), it is important to present the findings in a format that is appropriate to the audience. This may mean a short report, a memo, or even a poster or newsletter. As such, it is recommended that you consider from the start how you will present your evaluation, so that you can tailor your findings to the presentation format (such as graphs, tables, text, or images).
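As one small example of tailoring findings to a format: if the audience prefers a graph over a table, achieved values can be plotted against targets. The indicator names and numbers below are made up for illustration, and matplotlib is just one possible plotting choice.

```python
# Illustrative sketch: presenting monitoring results as a simple chart for a report or poster.
# Indicator names and values are hypothetical.
import matplotlib.pyplot as plt

indicators = ["People reached", "Workshops held", "Follow-up visits"]
targets = [500, 12, 60]
achieved = [430, 12, 41]

x = range(len(indicators))
width = 0.35

plt.bar([i - width / 2 for i in x], targets, width, label="Target")
plt.bar([i + width / 2 for i in x], achieved, width, label="Achieved")
plt.xticks(list(x), indicators)
plt.ylabel("Count")
plt.title("Progress against targets (illustrative data)")
plt.legend()
plt.tight_layout()
plt.savefig("progress_against_targets.png")
```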
As a final check, ask whether you:
- Focus on the key evaluation questions and the evaluation audience?
- Capture all that you need to know in order to make a meaningful evaluation of the project?
- Only ask relevant monitoring questions and avoid the collection of unnecessary data?
- Know how data will be analysed, used and reported?
- Work within your budget and other resources?
- Identify the skills required to conduct the data collection and analysis?