LEARNING FROM IMPLEMENTATION
RONALD MUTASA
JED FRIEDMAN
WENCELAS NYAMAYARO
PATRON MAFAUNE
CHENJERAI SISIMAYI
EUBERT VUSHOMA
BERNARD MADZIMA
March 2014
Rationale for learning from implementation
 Fast-changing implementation landscape
 Decentralized implementation arrangements
 Need to go beyond numbers – positivist paradigm (quantity or quality score indicators)
 Capture rich experiences and lessons from frontlines of PBF implementation & context
 Community engagement and support (HCC)
 Geographic influences (supervision aspect, cross-catchment area patient movement)
 Health facility management skills and dynamism
 Extent of mentorship and clinical supervision by district/province
 Striving for excellence by learning from and improving the design
 Supporting impact evaluation, not substituting for it – better understanding of our intervention
Process & Impact Evaluation
Answering Policy and Operationally Relevant Questions
Implementation & Learning Platforms
• Program Inception → Baseline (2011)
• Technical Review (June 2012) → Technical Adjustments: Prices and Services
• Mid-Term Review (February 2013) → Technical Modifications: clinical quality, streamlining verification, equity monitoring
• Process Monitoring and Evaluation (PME) (November 2013); 2nd PME round planned for November 2014
• Midline Impact Evaluation (Mar–Sept 2014)
• Endline IE (TBD)
• Routine Performance Review (Quarterly) – Operational Data
Analysis of Quality Sub-component Performance
[Paired radar charts of facility quality sub-component scores (0–100) – Maternity service, Medicine stock management, Family and child health, Referral services, Community services, Environmental health – for Quarter 2 2012 vs. Quarter 3 2013]
• Selected component scores show improvement between the first and latest quarters (comparison sketched below)
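The quarter-to-quarter comparison above can be reproduced from facility-level checklist data. A minimal sketch, assuming a hypothetical long-format table with facility, quarter, sub-component, and score (0–100) columns; the column names, values, and mean aggregation are illustrative assumptions, not the program's actual verification schema.

```python
import pandas as pd

# Hypothetical long-format quality checklist data; column names and
# values are illustrative only, not the actual RBF verification schema.
scores = pd.DataFrame({
    "facility":     ["A", "B", "A", "B", "A", "B", "A", "B"],
    "quarter":      ["2012Q2"] * 4 + ["2013Q3"] * 4,
    "subcomponent": ["Maternity service", "Maternity service",
                     "Referral services", "Referral services"] * 2,
    "score":        [55, 60, 40, 45, 75, 80, 70, 65],
})

# Mean sub-component score per quarter: the quantity plotted on each
# spoke of the radar charts above.
summary = (
    scores.groupby(["quarter", "subcomponent"])["score"]
          .mean()
          .unstack("quarter")
)

# Quarter-over-quarter change for each sub-component.
summary["change"] = summary["2013Q3"] - summary["2012Q2"]
print(summary)
```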
Geographical Spread of Performance
• General upward trend in the performance of indicators across the districts
• Example: all districts experience a strong positive trend in ANC4+ (four or more antenatal care visits), but 8 districts demonstrate significantly higher positive slopes of change (per-district trend fitting sketched below)
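A per-district comparison of slopes like the one described above can be produced with a simple linear fit of coverage on time for each district. A minimal sketch, assuming a hypothetical table of quarterly ANC4+ coverage by district; the district names, values, and column names are illustrative assumptions, not the program's actual data or analysis.

```python
import pandas as pd
from scipy.stats import linregress

# Hypothetical quarterly ANC4+ coverage (%) by district; all values
# are made up for illustration only.
data = pd.DataFrame({
    "district": ["District A"] * 4 + ["District B"] * 4,
    "quarter":  [0, 1, 2, 3] * 2,   # quarters since program inception
    "anc4":     [40.0, 44.0, 49.0, 55.0, 42.0, 43.0, 45.0, 46.0],
})

# Fit an ordinary least-squares trend of ANC4+ over time separately
# for each district; districts with steeper significant slopes are the
# "higher positive slopes of change" flagged on the slide.
for district, grp in data.groupby("district"):
    fit = linregress(grp["quarter"], grp["anc4"])
    print(f"{district}: slope = {fit.slope:.2f} pts/quarter, p = {fit.pvalue:.3f}")
```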
Moving beyond the ‘Black Box’
[Diagram: PBF Investments → "Black Box" → Program Outcomes]
Process Monitoring & Evaluation
 Helps validate the program model and its results
 Serves as a tool to avoid failure due to lack of “fidelity”
 Provides evidence for mid-course adjustments to program design and implementation
 Supplies rapid feedback about operations and outcomes that guides program evolution
 Explains the contextual factors that matter most & accounts for variation in provider performance
 Deepens policy and donor dialogue through in-depth understanding of the intervention (1st & 2nd generation PBF issues)
PME Approach
 Sequential mixed-methods design deployed
 Quantitative data were collected and analyzed to characterize the performance level of the health facilities
 Health facilities fitting pre-identified criteria (high, medium and low performers) were sampled to gather rich contextual qualitative data (sampling step sketched below)
 Wherever relevant and feasible, the quantitative data (on trends in service utilization, etc.) were triangulated with the qualitative data on the implementation and performance of RBF
 Key informants
 Health Center Committees, Health Workers, Heads of Health Facilities, District Health Executives (Management Teams) & communities within catchment areas
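The sequential step above – score facilities quantitatively, then purposively sample high, medium, and low performers for qualitative work – can be sketched as follows. A minimal illustration, assuming a hypothetical composite performance score per facility; the tertile cut and the two-facilities-per-stratum sample size are illustrative assumptions, not the study's actual sampling rule.

```python
import pandas as pd

# Hypothetical facility performance scores from the quantitative phase;
# facility names and scores are illustrative only.
facilities = pd.DataFrame({
    "facility": [f"HF{i:02d}" for i in range(1, 13)],
    "score":    [34, 72, 55, 90, 41, 63, 28, 85, 50, 77, 39, 68],
})

# Stratify into low / medium / high performers by score tertile.
facilities["stratum"] = pd.qcut(
    facilities["score"], q=3, labels=["low", "medium", "high"]
)

# Purposive sample: a fixed number of facilities per stratum for the
# qualitative phase (key-informant interviews, etc.).
sample = (
    facilities.groupby("stratum", observed=True)
              .sample(n=2, random_state=0)
)
print(sample.sort_values("stratum"))
```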
Moving beyond the ‘Black Box’ to the
‘Transparent Box’
[Diagram: PBF Investments → Validated Conceptual Framework → Program Outcomes]
Take Home Messages
 Learning from implementation pays huge dividends in PBF (e.g. strategic purchasing; depth & focus of supervision)
 Enables greater understanding of demand-side/community dimensions
 Enhances realization of the full effect of PBF
 Context, context, context!!
THANK YOU