ALMM monitoring and evaluation tools
Alberto Cerda
Key expert on ALMM monitoring and evaluation
First meeting of Working Group 2
June 24, 2010
EUNES IPA Project
Technical Assistance to enhance forecasting and evaluation capacity of the National Employment Service
Europeaid/128079/C/SER/RS
Monitoring
The types of information needed:
− Programme inputs
− Progress against objectives and the Implementation Plan
− Results of activities and outputs
− Impact on the target group
− The way the programme is managed and its style of work
The means of gathering information: site visits, interviews, observation, and analysis of activity reports and financial documents.
Performance Monitoring System – levels
• Process monitoring
• Impact monitoring
A Performance Monitoring System provides the basis for the kind of management information system that is essential for programme operations, especially in situations where implementation is delegated or decentralised to the local level.
The monitoring system for an ALMM-funded project – key documents:
• Quarterly monitoring report
• Quarterly financial report
• Participant start and completion details
Focus on: target group(s) analysis, costs, completion rate, qualifications obtained as a result of participation, and employment status after completion.
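As a minimal sketch of how these focus indicators could be derived from participant start and completion details, assuming hypothetical record fields (illustrative only, not the actual NES record layout):

```python
# Sketch: computing the monitoring focus indicators from participant
# start/completion records. Field names and figures are hypothetical.

participants = [
    {"completed": True,  "qualified": True,  "employed_after": True,  "cost": 950.0},
    {"completed": True,  "qualified": False, "employed_after": False, "cost": 950.0},
    {"completed": False, "qualified": False, "employed_after": False, "cost": 400.0},
]

started = len(participants)
completed = sum(1 for p in participants if p["completed"])

completion_rate = completed / started if started else 0.0
qualification_rate = (sum(1 for p in participants if p["qualified"]) / completed) if completed else 0.0
employment_rate = (sum(1 for p in participants if p["employed_after"]) / completed) if completed else 0.0
cost_per_participant = sum(p["cost"] for p in participants) / started if started else 0.0

print(f"Completion rate:       {completion_rate:.0%}")
print(f"Qualification rate:    {qualification_rate:.0%}")
print(f"Employed after exit:   {employment_rate:.0%}")
print(f"Cost per participant:  {cost_per_participant:.2f}")
```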
Monitoring vs. Evaluation
Monitoring:
• assesses the success or failure of the programme
• provides rapid information
• serves quality control and procedural purposes
• looks at the effectiveness of the ALMPs, taking secondary effects into account
Evaluation:
• provides explanations
• is a longer-term process
• determines whether and why a programme is successful
• assesses implementation at all stages of the programme
• makes use of monitoring data
Evaluation Objectives
To find out:
• progress towards objectives
• who benefits from the intervention
• impact on the beneficiaries
• changes to the target group
To make recommendations about:
• programme improvements
• revision of the aims and objectives
• evaluation of future work
• cost-effectiveness
To assess:
• the causes of impact
• aims and objectives
• efficiency of work and resources
• changes in the needs of the target group
Programme Evaluation
• Partly a statistical exercise
• Statistics do not provide a full assessment
• Good evaluation involves evidence-based interpretation
• Elements of qualitative analysis
Evaluation Elements: Outcomes and Process
Outcomes – what was achieved, and with what results?
• estimated impacts of the programme on the individual
• whether the impacts yield net social gains
• outcome for the money spent
• feasibility of replicating the programme's outcomes
Process – how the outputs were achieved and how the programme was managed:
• programme design and methodology
• programme management
• service delivery mechanisms
• the quality of the co-operation with partner organisations
• innovation
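To make "estimated impacts" and "net social gains" concrete, here is a deliberately naive net-impact sketch, assuming a comparison group of similar non-participants is available (all numbers illustrative; a real impact evaluation would control for selection effects):

```python
# Naive net-impact sketch: the programme's net effect is the participants'
# employment rate minus the rate of a comparison group (the counterfactual).
# All figures are illustrative, not real EUNES data.

participants_n, participants_employed = 200, 120   # programme completers
comparison_n, comparison_employed = 200, 90        # similar non-participants
total_cost = 180_000.0                             # total programme spending

gross_rate = participants_employed / participants_n        # 60% employed
counterfactual_rate = comparison_employed / comparison_n   # 45% would be anyway
net_rate = gross_rate - counterfactual_rate                # 15 pp net impact

net_placements = net_rate * participants_n                 # ~30 extra jobs
cost_per_net_placement = total_cost / net_placements       # outcome per money spent

print(f"Net impact: {net_rate:.0%}, cost per net placement: {cost_per_net_placement:,.0f}")
```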
Evaluation: How and Who?
• Who should undertake the evaluation?
• When should evaluation take place?
• What should be evaluated?
• How is an evaluation conducted?
Who Should Undertake the Evaluation?
• self-evaluation
• external evaluation
Self-Evaluation
• probably conducted by most programmes
• staff independent from its management
• time and other resources
• available data
• research methods and data analysis
• progress of the programme
• interpretation of the information
External Evaluation
• expert services
• cost-effectiveness
• access to honest information from the staff
• other organisations take the results seriously
• evaluation within a wider context
• decision: Terms of Reference
When Should Evaluation Take Place?
Evaluation should take place at two points:
• at an interim stage;
• at the end of the approved period for the programme.
The monitoring data you collect are a key source of information for both interim and final evaluations.
How Is An Evaluation Conducted?
PLAN → ANALYSE INDICATORS → ANALYSE DATA → REPORT
Planning the evaluation – key points:
• focus of the evaluation
• programme objectives
• programme products
• programme processes
• subject, purpose and target audience
Analyse the Performance indicators
Performance indicators:
• Output indicators – quantitative measures, based on your programme's key targets and objectives (sketched below)
• Process indicators – quantitative or qualitative
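For instance, output indicators can be read as a simple target-versus-actual comparison; a small sketch with hypothetical indicator names, targets and figures:

```python
# Sketch: checking achieved outputs against the programme's key targets.
# Indicator names, targets and achieved values are hypothetical.

targets  = {"participants_enrolled": 500, "courses_delivered": 40, "completers_employed": 250}
achieved = {"participants_enrolled": 470, "courses_delivered": 42, "completers_employed": 210}

for indicator, target in targets.items():
    actual = achieved.get(indicator, 0)
    print(f"{indicator}: {actual}/{target} ({actual / target:.0%} of target)")
```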
Gather and analyse the data
Questions to ask:
• Is the information already available?
• Do you need to establish a baseline?
• What gaps are there in the data? (see the sketch below)
• What do you need to collect in addition to what exists?
• How will the information be gathered and recorded?
• Are these procedures feasible?
Methods used to gather the data include record-keeping, observation of participants, self-administered questionnaires, individual interviews, and group discussions/focus groups.
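On the data-gap question above, a quick sketch of screening monitoring records for missing fields before analysis (the required field names are hypothetical):

```python
# Sketch: flagging gaps in monitoring records before analysis.
# The required fields shown here are hypothetical examples.

REQUIRED_FIELDS = ("participant_id", "start_date", "completion_date", "employment_status")

records = [
    {"participant_id": 1, "start_date": "2010-03-01", "completion_date": "2010-05-30", "employment_status": "employed"},
    {"participant_id": 2, "start_date": "2010-03-15", "completion_date": None, "employment_status": None},
]

for record in records:
    missing = [f for f in REQUIRED_FIELDS if record.get(f) is None]
    if missing:
        print(f"Record {record['participant_id']}: missing {', '.join(missing)}")
```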
Reporting of the findings
An interim evaluation report covers:
• programme background;
• achievements and problems;
• trends and issues; and
• action needed.
The final evaluation report covers:
• the background to the programme;
• the methods used to gather and analyse the information;
• the results/findings, evidence and lessons;
• recommendations.
Dissemination
The evaluation will be of interest to a wide audience, including:
• beneficiary groups;
• local, regional and national social partner organisations;
• policy-makers and representatives from intermediary organisations.
Take into account:
• tailor the contents to the audience;
• include only the necessary information and evidence;
• focus on key findings;
• be transparent;
• reference all external sources; and
• include a summary.
Thank you for your attention!
Alberto Cerda
Key expert on ALMM monitoring and evaluation
acm.eunes@gmail.com
EUNES IPA Project
Technical Assistance to enhance forecasting and evaluation capacity of the National Employment Service
Europeaid/128079/C/SER/RS


Editor's Notes

  1. Process monitoring – looks at the use of resources, the progress of activities, and the way these are carried out. Impact monitoring - tracks progress in delivery of the expected end-results of the programme, using the previously developed Performance Indicators (PIs), and the impact the programme is having on target groups.
  2. Evaluation is partly a statistical exercise, but statistics (such as numbers of people enrolled on a pilot training programme) do not in themselves provide a full assessment. A good evaluation involves evidence-based interpretation, and usually an element of qualitative analysis, such as learners’ surveys, discussions with beneficiaries or trainers.
  3. Two elements of the programme therefore should be evaluated: the outcomes and the process. According to one experienced commentator, ALMP impact evaluations done in most OECD countries typically answer only the first question. Some impact evaluations consider partially the second question, whereas none consider the third.
  4. Both types of evaluation (self and external) can be applied to the same programme, though they may be carried out at different times, or address different questions or issues.
  5. The starting point for the evaluation will be some basic questions such as: What were the expected outcomes of this programme, what are the actual outcomes, and how do they compare? What critical success factors can be associated with these outcomes? What processes could be used to measure progress against these factors? Who will carry out these processes (external or internal personnel)? What will be the outcome of the process (e.g. a report, a presentation)?
  6. Often, an important part of an ALMM programme is to get successful activities, outcomes, products and lessons taken up on a wider basis. This is termed 'mainstreaming'. It involves policy-makers and practitioners at local, regional, or national level using your programme's outcomes to inform their actions. Dissemination activities will play a large part in enabling this to take place. Your evaluation report may contain important lessons (both positive and negative) for future policy and practice far beyond the domain of your project itself (e.g. higher-level strategic managers, policy analysts, counterparts in other regions of Serbia or neighbouring countries). Drawing out the lessons from programme 'failure' can be useful in informing future developments, so that pitfalls can be avoided and success maximised. In any case, whether a programme is of good or of bad quality, a frank and independent evaluation of the processes and products is vital.