
General Principles: Evaluation


General Principles

Harmonized approaches to Health System Strengthening (HSS) M&E offer the following benefits:
1. Reduced transaction costs
2. Increased efficiency
3. Reduced pressure on individual projects

➤In the context of HSS, performance includes not only reduced mortality but also reduced morbidity, improved equity, protection from financial risk, and responsiveness.
Challenge
1. WHO framework for health systems performance assessment (2000)
2. World Bank control knobs framework (2004)
3. WHO building blocks framework (2006)
Consensus: monitoring and evaluation must address performance in terms of both health system measures and population health measures.


Monitoring
• It is the routine checking of information on progress, confirming that progress is occurring in the defined direction.
• Used to ensure that what has been planned is going forward as intended and within the
resources allocated.
• PURPOSE: To track implementation progress through periodic data collection (CDC,
2021)
• GOAL: provide early indicators of progress (CDC, 2021)
Evaluation
• Used to ensure that the direction chosen is correct whereby the correct combination of
strategies and resources are utilized to get there.
• It can be formative or summative, focusing on the outcomes and their relationship to the outputs.
• PURPOSE: Determine effectiveness of a specific program or model and understand why a
program may or may not be working (CDC, 2021)
• GOAL: to improve programs for implementation (CDC, 2021)
Steps in Evaluation Practice
1. Engage stakeholders: those involved, those affected, primary intended users
2. Describe the program: need, expected effects, activities, resources, stage, context, logic model
3. Focus the evaluation design: purpose, users, uses, questions, methods, agreements
4. Gather credible evidence: indicators, sources, quality, quantity, logistics
5. Justify conclusions: standards, analysis/synthesis, interpretation, judgment, recommendations
6. Ensure use and share lessons learned: design, preparation, feedback, follow-up, dissemination

Standards for Effective Evaluation
• Utility: serve the information needs of intended users
• Feasibility: be realistic, prudent, diplomatic, and frugal
• Propriety: behave legally, ethically, and with due regard for the welfare of those involved and those affected
• Accuracy: reveal and convey technically accurate information

Monitoring and evaluation framework


Framework purposes:
1. Monitoring of programme inputs, processes and results, required for management of health
systems investments.
2. Health systems performance assessment, as a key input to project decision making
3. Evaluating the results of health reform investments and identifying which approaches work
best.
NOTE: the framework should not only facilitate the identification of core indicators but also
connect indicators to data sources and data collection methods.
M&E FRAMEWORKS
• Outlines what is needed across the results chain in terms of tools for data quality assurance,
synthesis and analysis, with a focus on building project capacities.
• Addresses the importance of dissemination, communication and use of monitoring and
evaluation results to inform policy making at all levels.
Four Major Indicator domains
• System inputs, processes and outputs reflect health systems capacity.
• Outputs, outcomes and impact are the results of investments and reflect health systems
performance. For each block of indicators, preferred and alternative data sources are
recommended.
• The direction of the framework should be from immediate to longer term.
Input Domain/indicator
• Measure the contributions necessary to enable the program to be implemented
Examples: funding, staff, key partners, infrastructure
Process domain/indicator
• Measure the program's activities and outputs to indicate whether the program is being
implemented as planned.
• The production of strong outputs is a sign that the program's activities have been
implemented correctly.
Output Domain/Indicator
• Measures the products, capital goods and services which result from a development
intervention
• Includes changes resulting from the intervention which are relevant to the achievement of
outcomes
Outcome domain/indicator
• Measure whether the program is achieving the expected effects/changes in the short,
intermediate and long term.
• Measure the changes that occur over time, comparing the baseline (before the project
begins) with the end of the project (see the sketch below).
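As a minimal sketch (the indicator name and values below are hypothetical), the change in an outcome indicator can be summarised by comparing the baseline with the endline measurement:

```python
# Minimal sketch: change in a hypothetical outcome indicator between
# the baseline (before the project begins) and the end of the project.

def outcome_change(baseline: float, endline: float) -> dict:
    """Return the absolute and relative change from baseline to endline."""
    absolute = endline - baseline
    relative = 100.0 * absolute / baseline if baseline else float("nan")
    return {"absolute_change": absolute, "percent_change": relative}

# Example: skilled birth attendance rises from 62% at baseline to 74% at endline.
print(outcome_change(baseline=62.0, endline=74.0))
# {'absolute_change': 12.0, 'percent_change': 19.35...}
```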
Impact domain/indicator
• These are the positive and negative, primary and secondary long-term effects produced by a
development intervention, whether direct or indirect, intended or unintended.
Use of Core indicators
• Indicators should be measurable and amenable to target setting, with appropriate metadata
for each indicator (see the sketch after this list).
• GOAL: identify a comprehensive list of core indicators that capture all areas of the framework.
• CHALLENGE: ensure an appropriate balance across the full range of the M&E framework.
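As a rough illustration of what such metadata and targets might look like, the sketch below uses a hypothetical indicator record; the field names and values are assumptions, not a prescribed standard:

```python
from dataclasses import dataclass

@dataclass
class CoreIndicator:
    """Illustrative metadata record for one core indicator (fields are assumptions)."""
    name: str
    domain: str        # input, process, output, outcome or impact
    definition: str
    data_source: str   # preferred data source
    baseline: float
    target: float      # agreed target for the reporting period
    frequency: str     # how often the indicator is measured and reported

# Hypothetical example entry:
dtp3 = CoreIndicator(
    name="DTP3 coverage",
    domain="output",
    definition="% of infants receiving three doses of DTP-containing vaccine",
    data_source="routine facility reports, verified against household surveys",
    baseline=78.0,
    target=90.0,
    frequency="annual",
)
```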
Use of Core Indicators:
Indicators have been selected using the following criteria:
1. They address all aspects of health systems performance and cover each domain along the
results chain
2. They draw upon existing indicator lists, including the SDGs, Countdown, and programme
indicators of health sector performance
3. They are SMART: scientifically robust, useful, accessible, and understandable
Data Sources
Sources of health data can be divided into two broad groups:
1. Data generated relative to populations as a whole
2. Data generated as an outcome of health-related administrative and operational activities.
NOTE: Other information from health research, clinical trials and longitudinal community
studies may also feed into the health information system.
• GOAL: all community programs have in place the range of data sources needed to generate
critical data sets.
• CHALLENGE: ensure that there is an appropriate mix of data sources to ensure that data
sets and core indicators can be generated to high standards of quality and efficiency.
Data analysis and synthesis
• First step involves systematic data quality assessment
• Must be transparent and in line with international standards.
• Identifying and accounting for biases due to incomplete reporting, inaccuracies and
non-representativeness
NOTE: Efficiency can be assessed by analysing inputs in relation to results in terms of outputs,
outcomes and impact.

➤GOAL: To establish the effectiveness of the interventions; cost-effectiveness analysis is
essential to draw the ultimate conclusions (see the sketch below).
➤CHALLENGE: Most countries do not have a systematic way in which data, statistics and
qualitative information are brought together.
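To make the role of cost-effectiveness analysis concrete, here is a minimal sketch of an incremental cost-effectiveness ratio; the cost and effect figures are hypothetical and the calculation is only the simplest possible illustration:

```python
# Minimal sketch: incremental cost-effectiveness ratio (ICER) with
# hypothetical costs and health effects (e.g. DALYs averted).

def icer(cost_new: float, cost_old: float,
         effect_new: float, effect_old: float) -> float:
    """Extra cost per extra unit of health effect gained."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Example: the reform costs 1.2M vs 0.9M and averts 5,000 vs 3,800 DALYs.
print(icer(1_200_000, 900_000, 5_000, 3_800))  # 250.0 (cost per DALY averted)
```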

Data dissemination, communication and use


• Translation of data into information relevant for decision making
• Requires packaging, communication and dissemination of statistics in a format and
language accessible to higher-level policy and decision makers.
• Effective communication of results using multiple media, including dashboards, targets
and benchmarking.

Application of M&E framework

1. Evaluation
• Large-scale evaluation studies of complex interventions often use a stepwise approach to
link trends in health outcomes, coverage and risk behaviors, access to and quality of
services, and funding, since randomization is not possible.
• Time series and dose-response analyses are used to explain changes over time and
attribute them to specific investments, often complemented by modelling (see the sketch below).
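The sketch below illustrates the simplest version of such a time-series analysis, fitting a linear trend to hypothetical annual coverage data; real evaluations use far richer models, covariates and modelling tools:

```python
import numpy as np

# Hypothetical annual coverage (%) for an intervention, 2016-2023.
years = np.arange(2016, 2024)
coverage = np.array([48, 51, 55, 60, 62, 67, 71, 74], dtype=float)

# Ordinary least-squares linear trend: coverage ~= slope * year + intercept.
slope, intercept = np.polyfit(years, coverage, deg=1)
print(f"Average change: {slope:.1f} percentage points per year")

# A dose-response analysis could then relate the yearly coverage gains to
# programme funding levels, complemented by modelling as described above.
```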

2. Immunization tracking
• Every national M&E plan will already include indicators of immunization coverage and
child mortality among the core set of priority indicators measured and reported against on
a regular basis.
• Efforts to improve M&E of immunization programmes - particularly in the context of
results-based financing - need high quality data and systematic review processes that
should be integrated into national M&E systems, linked to national plans.
• This provides the basis for effective use of immunization data in national review processes
and for global reporting.
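For example, administrative immunization coverage is commonly computed as doses administered divided by the target population; the figures in this minimal sketch are hypothetical:

```python
def admin_coverage(doses_administered: int, target_population: int) -> float:
    """Administrative coverage (%) computed from routine programme data."""
    return 100.0 * doses_administered / target_population

# Hypothetical district: 9,200 third DTP doses given, 10,500 surviving infants.
print(f"DTP3 coverage: {admin_coverage(9_200, 10_500):.1f}%")  # 87.6%
```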

3. HIV/AIDS tracking
• Some HIV/AIDS investments have gone into strategic information, including surveys,
surveillance, facility-based information systems and programme monitoring & evaluation,
although their impact on strengthening country systems has been limited.
• Future investments should improve the availability of quality data and strengthen AIDS
M&E in a way that increasingly benefits the health information system.
• The M&E framework approach provides a platform for better integration of HIV monitoring
and evaluation systems into a country's health information system, while still ensuring
complete and accurate reporting for key HIV interventions.

4. Performance-based funding and tracking results


• It will also be important that investments in improving data availability and quality for
performance-based funding are made in a way that strengthens countries' overall health
surveillance systems, including better data for annual health sector reviews and other key
decision-making processes.
• Routine monitoring of performance-based funding initiatives should track, over time, the
level of discrepancy between administrative data and independent data sources such as
household surveys (see the sketch below).
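A minimal sketch of tracking that discrepancy over time, using hypothetical administrative and survey estimates:

```python
# Hypothetical coverage estimates (%) by year: routine administrative data
# versus an independent household survey.
admin = {2021: 92.0, 2022: 95.0, 2023: 97.0}
survey = {2021: 84.0, 2022: 85.0, 2023: 86.0}

for year in sorted(admin):
    gap = admin[year] - survey[year]
    print(f"{year}: discrepancy = {gap:+.1f} percentage points")
# A widening gap can flag over-reporting in routine data under
# performance-based funding and should trigger a data-quality review.
```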
