TDR Performance Assessment Framework
REVISION—2012–2017
Measuring for improvement
For research on diseases of poverty
UNICEF • UNDP • World Bank • WHO
For more information, please contact:
Dr Beatrice Halpaap
WHO/TDR
halpaapb@who.int
TDR/STRA/13.1
Abbreviations
DEC Disease endemic country
GSPA-PHI Global Strategy and Plan of Action on Public Health, Innovation and Intellectual Property
JCB Joint Coordinating Board
KPI Key performance indicator
LIC Low-income country
MDG Millennium Development Goal
OECD-DAC Development Assistance Committee of the Organization for Economic Co-operation and Development
PAF Performance Assessment Framework
SAG Special Advisory Group
SC Standing Committee
SMG Senior Management Group
SPT Special Project Team
STAC Scientific and Technical Advisory Committee
TDR Special Programme for Research and Training in Tropical Diseases, co-sponsored by UNICEF, UNDP, World Bank and WHO
UNEG United Nations Evaluation Group
WHA World Health Assembly
WHO World Health Organization
This Framework is a key element in the implementation of TDR's 2012–2017 strategy1. It has the following objectives:

• Promote continuous performance improvement through organizational review, learning and informed decision-making.

• Enhance accountability to stakeholders, including beneficiaries, partners and resource contributors.

• Ensure strategic relevance and coherence of TDR's activities to meet the aspirations expressed in the vision, mission and strategy.

• Ensure TDR's performance assessment is harmonized and consistent with international practices.

An initial framework was developed in 2009 in consultation with TDR staff, WHO research-related programmes and regional offices and TDR's co-sponsors, as well as external advisers from research and training funding institutions, development agencies, research institutions and individual researchers from disease endemic countries (DECs), as shown in Annex 1.

The framework is a tool used by both TDR staff and a broad range of stakeholders involved in the governance and implementation of TDR's strategy. It promotes and guides the systematic assessment of TDR's strategic and technical relevance and contribution towards its vision and mission, and it clarifies how performance assessment at the various levels fits together into one integrated system.

TDR's vision is for "the power of research and innovation to improve the health and well-being of those burdened by infectious diseases of poverty".

For that purpose, TDR has set its mission to "foster an effective global research effort on infectious diseases of poverty and promote the translation of innovation to health impact in disease endemic countries".

Assessing performance is an ongoing process, and this framework is continuously being reviewed and refined to address the needs of the Programme in achieving its objectives. This document outlines the proposed framework in the context of the current systems in place to review TDR's performance and contains four parts:

• Part I describes the purpose, proposed approaches and principles of performance assessment in TDR. It defines the different levels and specific areas of assessment.

• Part II presents TDR's expected results and the key performance indicators identified to measure progress and reflect the Programme's performance.

• Part III describes the current process for monitoring and evaluating this performance.

• Part IV explains how monitoring and evaluation findings are utilized for organizational learning and performance improvement.

Terms adopted by TDR are listed at the end of this document2. Annex 2 provides a summary of the various reporting instruments. The TDR monitoring and evaluation matrix is presented in Annex 3. For each key performance indicator it lists: (i) the specific achievement target; (ii) baseline data representing the situation at the beginning of the reference period; (iii) the source of verification; (iv) who is responsible for conducting the measurement; and (v) when the measurement needs to be made.

A suitable system to assess performance allows for cost-efficient and real-time measurement and monitoring of progress indicators to inform decision-making. Aligned with the new TDR strategy, the current revision of the framework further demonstrates TDR's focus on health impact and value for money throughout the whole results chain, from using resources carefully to building efficient processes, to quality of outputs, and to the sustainability of outcomes (Fig. 1).
[Fig. 1: TDR results chain, linking impact goals and organizational learning.]
The performance assessment, including monitoring and evaluation activities, is guided by TDR's past experience, principles outlined in international guidelines3 and lessons learnt from other international organizations (Annex 1). Guiding principles include:

• Inclusiveness and transparency
Engaging TDR staff and stakeholders in the development of the monitoring and evaluation matrix, as well as in the assessment of results. Sharing monitoring and evaluation data to enhance organizational learning and utilization of the evidence.

• Usefulness
Promoting ownership of performance assessment at each Programme level and ensuring that the system is useful to staff and stakeholders alike. Promoting organizational learning towards performance improvement, policy analysis, informed decision-making and enhanced strategic relevance of the Programme.

• Harmonization within TDR and with international practices
Seeking to harmonize monitoring and evaluation practices with those of its co-sponsors and other international stakeholders to enhance coherence, collaboration and synergy.

• Credibility and practicability
Applying the 'keep it simple' concept to the monitoring and evaluation system to ensure feasibility and credibility, and to facilitate the system's implementation by stakeholders.

• Incremental approach
Optimizing the system progressively and continuously while building on existing systems and good practices.
The assessment framework has a broad and comprehensive scope when addressing the Programme's expected results, core values and management performance. These are monitored and evaluated at activity, team and Programme levels, as described below.

4.1. Assessing performance at activity, team and Programme levels

The framework provides a performance assessment structure at the following levels:

• Activity level (project management and contract management, including research grants)
• Team level (areas of work)
• Programme level.

To ensure consistency and coherence, the various measurements need to be aggregated as much as possible throughout the Programme. Monitoring and evaluation findings at the activity level are aggregated at the team level. Measurements at the team level are, in turn, aggregated at the Programme level, as shown in Fig. 3.

3. Principles for evaluation of development assistance. Paris, OECD Development Assistance Committee, 1991 (http://www.oecd.org/dataoecd/31/12/2755284.pdf, accessed on 4 January 2013); UNEG ethical guidelines for evaluation. New York, The United Nations Evaluation Group, 2007 (http://www.unevaluation.org/ethicalguidelines, accessed on 4 January 2013).
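The activity-to-team-to-Programme roll-up can be pictured with a small sketch. This is an illustration only, with invented team names, activities and values; it is not TDR's actual reporting system or data model.

```python
# Illustrative roll-up of one indicator's measurements:
# activity level -> team level -> Programme level. All data invented.
from collections import defaultdict

# (team, activity, measured value) tuples, e.g. counts reported per grant
measurements = [
    ("Vector control", "Grant A", 3),
    ("Vector control", "Grant B", 2),
    ("Case management", "Study C", 4),
]

def aggregate(measurements):
    """Sum activity-level values to team totals, then to a Programme total."""
    team_totals = defaultdict(int)
    for team, _activity, value in measurements:
        team_totals[team] += value
    programme_total = sum(team_totals.values())
    return dict(team_totals), programme_total

team_totals, programme_total = aggregate(measurements)
print(team_totals)      # {'Vector control': 5, 'Case management': 4}
print(programme_total)  # 9
```

The same pattern applies to any indicator that is additive across levels; indicators that are not additive (e.g. proportions) need team-level numerators and denominators rolled up separately.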
[Fig. 3: Programme performance planning and measurement: activity performance feeds into team performance, which feeds into Programme performance.]
To guide the performance assessment, the Programme's expected results are clearly outlined. The results chain (Fig. 4) presents these expected results and reflects the Programme's logic in achieving its objectives and contributing to the broader impact on global health. TDR's outcomes contribute to WHO's outcomes. They are reported to the World Health Assembly in conjunction with other WHO departments, offices and regions that share the same objectives.

Each team's specific expected results are consistent with the overall TDR results chain and feed into TDR's outputs and outcomes.

Technical progress is measured in relation to financial implementation, both at activity and team levels, and against initial or revised targets (agreed with donors where applicable) for deliverables. Monitoring of milestones, and addressing delays and other issues that may appear during project implementation, is part of the monitoring and reporting at team level (Fig. 5).

Financial implementation is assessed by comparing the amounts spent or contractually committed against the planned cost for each output and outcome (at team level). This information is available in quasi real-time to project managers and, together with information on technical implementation, helps inform decision-making, management review and reporting.
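The comparison described above amounts to a simple ratio: funds spent plus funds contractually committed, divided by the planned cost. A minimal sketch, using invented figures (the function name and amounts are illustrative, not TDR's actual financial tooling):

```python
# Sketch: financial implementation rate for one output. All figures invented.
def financial_implementation(spent, committed, planned):
    """Share of the planned cost that is spent or contractually committed,
    e.g. 0.75 means 75% financial implementation."""
    if planned <= 0:
        raise ValueError("planned cost must be positive")
    return (spent + committed) / planned

rate = financial_implementation(spent=120_000, committed=30_000, planned=200_000)
print(f"{rate:.0%}")  # 75%
```

Comparing this rate with technical progress (milestones reached versus planned) is what lets managers spot, for example, an output that has consumed most of its budget while delivering few of its milestones.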
Outcome 1 – Promotion and adoption of new and improved vector control methods and strategies

Outcome 2 – Policies and strategies influenced by new evidence from community-based vector control
• Output 2.3 – Sustainable Communities-of-Practice (CoP) of researchers

Outcome 3 – Policies and strategies influenced by new evidence about climate and environmental change impact on vector-borne diseases
• Output 4.3 – Evidence on integrated Community Case Management (iCCM) of malaria and pneumonia
Monitoring activities focus on tracking progress towards results (Fig. 6). Evaluation activities focus on assessing relevance, impact, effectiveness, efficiency and sustainability. Evaluation helps to understand the role of various underlying factors in the success or failure of activities and work areas. Although both monitoring and evaluation are ongoing processes from input to impact, monitoring is more relevant during implementation (from input to output), while evaluation is more relevant to results and expected changes (from output to impact). Periodic external evaluation will provide input so that the Programme maintains strategic relevance to global issues.

Managerial control of the process is greater during the implementation phase. Delivery of outputs can therefore be clearly attributed to the Programme. However, the Programme cannot achieve expected outcomes and impacts on its own – various stakeholders and external factors contribute to their attainment. While the TDR-specific contribution to outcomes and impacts cannot always be measured, it is possible to demonstrate the link between outputs and the desired/achieved outcomes and impact.
[Fig. 6: Results chain: inputs → process → outputs → outcomes → impact. Inputs, processes and outputs are directly attributed to TDR (attribution; monitoring asks "Are we on track?"), while TDR outputs are expected to contribute to global benefit (contribution; evaluation asks "Are we on the right track?").]
Out of a multitude of possible indicators, TDR has selected a limited number of relevant quantitative and qualitative key performance indicators to help measure progress and assess performance at the Programme level (see key performance indicators, Part II, Section 5). Additional performance indicators, at all three levels, may be developed in order to measure performance in a comprehensive way or highlight specific aspects that require attention. Performance indicators are selected at activity and team levels and aggregated up to the Programme level.

A range of indicators has been carefully selected to measure performance across TDR, as described in Part I, Section 4.2. It is understood, however, that the use of indicators has limitations when the objective is to express different aspects of programme performance (see quote below).

"Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted."
Albert Einstein, 1879–1955

With the proposed indicators TDR is aiming to reflect performance aspects that are traditionally hard to quantify. All the proposed indicators satisfy the SMART criteria (specific, measurable, attainable, relevant and time-bound).

Table 1 presents a consolidated list of key performance indicators used across the Programme to measure and report on the three main performance areas and progress made in implementing the strategy.

TDR's monitoring and evaluation matrix is presented in Annex 3. For each indicator, it presents:
(i) the specific achievement target;
(ii) baseline data representing the situation before the start of activities;
(iii) the source of verification;
(iv) who is responsible for conducting the measurement; and
(v) when the measurement will be made.
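The five elements recorded for each indicator in the matrix can be pictured as a simple record. The field names and example values below are illustrative assumptions, not TDR's actual schema or data:

```python
# Sketch of one monitoring and evaluation matrix entry; all values invented.
from dataclasses import dataclass

@dataclass
class MatrixEntry:
    indicator: str               # key performance indicator text
    target: str                  # (i) specific achievement target
    baseline: str                # (ii) situation before activities start
    source_of_verification: str  # (iii) where the evidence comes from
    responsible: str             # (iv) who conducts the measurement
    timing: str                  # (v) when the measurement is made

entry = MatrixEntry(
    indicator="Number of DEC institutions with expanded scope of activities",
    target="10 institutions by end of biennium",
    baseline="4 institutions (2011)",
    source_of_verification="Annual institutional reports",
    responsible="Team leader",
    timing="Annually, Q4",
)
print(entry.target)  # 10 institutions by end of biennium
```

Holding all five elements together per indicator is what makes the matrix auditable: a reported value can always be traced to a responsible person, a source and a measurement date.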
Enhanced research and knowledge transfer capacity within disease endemic countries:
7. Number of DEC institutions and/or networks demonstrating expanded scope of activities and/or increased funding from alternative sources thanks to TDR support.

Key stakeholders in disease endemic countries engaged in setting the research agenda and ensuring research reflects their needs:
9. Number and evidence of research-related agendas, recommendations and practices agreed by stakeholders at global, regional or country level.
10. Proportion of TDR outputs produced with key DEC stakeholder active involvement (within calendar year).
Sustainability of outcomes:
17. Number of effective public health tools and strategies developed which have been in use for at least two years.

Management performance
Effective resource mobilization:
19. Percentage of approved biennial budget successfully funded.
Both the TDR secretariat and stakeholders (such as grant and contract managers, advisory
committees, partners and governing bodies) carry out regular performance assessments. The
frequency of these reviews varies from monthly to yearly. Independent external evaluations
of TDR as a Programme are done once every five to seven years.
1.1. Team and activity levels

Monitoring at team and activity levels
Team leaders and project managers have developed indicators which contain a specific achievement target and timeline for measurement. These elements are reviewed internally at the quarterly management review meetings and externally by expert advisory committees and TDR's governing bodies. Performance monitoring activities are conducted according to the respective team's schedule, as shown in Fig. 5.

Evaluation by Special Advisory Groups (SAGs)
Ad hoc, time-limited independent SAGs assist TDR in the technical review of activities by focusing on specific areas or projects requiring additional or specialized input. The need for, and the specific issues taken on by, the SAGs are proposed by TDR staff and endorsed by STAC, and may include: advice on strategic direction, priority setting, screening and selection of projects, recommendations for funding, follow-up of progress and evaluation of results.

SAGs are proposed by the TDR Director to STAC, which appoints a chair from amongst its members with the most relevant scientific and technical expertise.

Ad hoc contracted evaluation studies
Evaluation studies to address specific issues or questions related to work areas or activities are conducted as required. These may be requested by TDR managers or by advisory committees or, in special circumstances, by TDR's governing bodies.

1.2. Programme level

Internal evaluation at quarterly management review meetings
At the quarterly management review meetings, team leaders present highlights of the progress made both on the technical side (project milestones) and on the financial side of projects and activities (funds spent and obligated versus planned costs). Any issues encountered, as well as risk mitigation measures, are discussed in the quarterly management reviews. The quarterly reviews provide an opportunity for sharing experience and organizational learning.

Governing bodies oversight
Joint Coordinating Board – Due to its nature as a United Nations co-sponsored research and training programme, TDR benefits from a special governance structure. The Programme is governed by the Joint Coordinating Board (JCB), consisting of countries elected by the six WHO regional committees; resource contributor countries or constituencies; other cooperating parties; and the four co-sponsoring agencies. The JCB reviews the expected results, performance and relevance of the Programme annually and approves the Programme's budget for each biennium. This Performance Assessment
Framework and the corresponding TDR Results Report are used as tools to guide the JCB's review.

Scientific and Technical Advisory Committee – The JCB and TDR Director are supported by a Scientific and Technical Advisory Committee (STAC) comprised of globally recognized experts. This committee undertakes an annual scientific and technical review of the Programme and advises on strategy directions. STAC reviews the Programme's expected results and performance as presented in the TDR Results Report and in the respective annual technical reports. The present framework guides this review.

Standing Committee – A Standing Committee consists of the four co-sponsoring agencies, namely UNICEF, UNDP, the World Bank and WHO, the chair and vice-chair of the JCB, the chair of STAC, a representative of the JCB resource contributors group and a representative of the disease endemic countries. It reviews the overall management of the Programme.

Processes – STAC reviews a draft version of the annual technical reports (by work area) and the TDR Results Report highlighting the Programme's performance, and makes recommendations. The revised documents and the draft TDR Annual Report are then reviewed by the Standing Committee, with the final reports submitted for approval to the JCB. The oversight review model described in Fig. 7 provides TDR with convening power, credibility as a neutral player, and access to global expertise and knowledge from multiple disciplines and sectors.
[Fig. 7: TDR oversight structure: Joint Coordinating Board (JCB); Standing Committee; Scientific and Technical Advisory Committee (STAC); Special Advisory Groups (SAGs); TDR secretariat.]
WHO's performance assessment by the World Health Assembly
TDR contributes to two of the thirteen WHO strategic objectives (SOs) highlighted in the Eleventh General Programme of Work, 2006-2015 – A Global Health Agenda: (i) SO1 – to reduce the health, social and economic burden of communicable diseases; and (ii) SO2 – to combat HIV/AIDS, tuberculosis and malaria. TDR's technical and financial progress towards achieving the specific expected results contributing to these two SOs is compiled in WHO's annual Performance Assessment Report, which is reviewed by the Executive Board and the World Health Assembly.

WHO has developed the Twelfth General Programme of Work, 2014-2019. As of 2014, TDR will contribute mainly to Category 1 (Communicable Diseases), with strong linkages to the other four categories (work on maternal and child health, outbreaks, health systems, etc.).

WHO internal audits
TDR's operational, administrative and financial procedures and practices are subject to audit by WHO's internal auditors, who perform ad hoc audits following the schedule and procedures established for WHO as a whole.

1.3. Roles and responsibilities

The TDR Director provides leadership in promoting performance assessment and supporting its use in the management cycle. The Director has overall responsibility for the Programme's performance.

The Senior Management Group (SMG) and team leaders are engaged in the implementation and review of the Performance Assessment Framework. The SMG has a critical role in promoting and leading continuous performance improvement at all levels of the Programme, utilizing the monitoring and evaluation data and contributing to organizational learning.

The Portfolio and Programme Management (PPM) unit is responsible for facilitating the performance assessment process in consultation with the Director's office, TDR staff and stakeholders, including donors and partners. It fosters the utilization of monitoring and evaluation findings for continuous improvement through portfolio analysis, and for providing the basis for policy advice and decision-making. PPM facilitates organizational learning, information management and risk management in close collaboration with other relevant units.

Team leaders and project managers are responsible for coordinating technical activities. They lead the development and implementation of expected results and related activity indicators in consultation with PPM, advisory committees and major stakeholders within and outside of WHO. Team leaders and project managers are also responsible for integrating systematic performance assessment and risk management within the activities of the teams.

Stakeholders have been extensively engaged in the development, implementation and revision of the Performance Assessment Framework. Resource contributors provided input into the design of the M&E matrix and helped define and revise TDR's results chain. Study investigators, consultants and institutions are under contract to manage activities, monitor their progress and evaluate results prior to independent review. Partners assist TDR in identifying collective outcomes and impact, and help develop means to jointly measure such indicators. External advisers such as advisory committee members evaluate relevance, quality and achievement of the activities, teams and the Programme as a whole.

Governing bodies, including representatives from disease endemic countries, review the Programme's expected results and performance and request periodic external reviews and ad hoc independent evaluations on specific issues as needed.
Implementation of the framework is an incremental process starting at the Programme level, then integrated, step-by-step, at team and activity levels. The framework builds on systems that already exist. As the framework is being implemented at team and activity levels, it is optimized to facilitate its application and to fit the needs of the Programme.

Internal and external review systems are used to facilitate a systematic TDR monitoring and evaluation process. The indicators have been selected to reflect progress on the strategic plan 2012-2017. Consideration was given to selecting a limited number of indicators that are sensitive enough and easy to measure.

Regular progress monitoring and performance evaluation provide a good understanding of where the Programme lies in achieving the expected results. They help clarify the factors underlying these achievements, make informed decisions and readjust the plans accordingly.

Organizational learning is critical if the process of performance assessment is to lead to performance improvement.

Fig. 8 shows how a monitoring and evaluation process fits into the overall management cycle of TDR and how the related findings are utilized to learn, share and make informed decisions at individual and organizational levels.
[Fig. 8: TDR management cycle: planning → implementing → monitoring & evaluation → reporting, feeding organizational learning and informed decision-making; performance improvement is supported through review meetings and lunch seminars.]
Described below are various opportunities at TDR to discuss collectively the monitoring and evaluation findings.

Monthly staff meetings provide a good opportunity for updates and sharing experiences.

Bi-weekly team leaders meetings discuss progress made and any issues encountered that need special attention. The meetings are also opportunities to review new processes, systems and policies ahead of their implementation at Programme level.

… reviewed. Progress on expected results (outputs and outcomes) is assessed. The indicators presented in the framework are reviewed and milestones highlighted. The review allows for reflection and discussion on past experiences. Risk management actions are followed up and additional measures identified as needed.

The governance structure and review processes through the advisory committees greatly facilitate performance improvement. Recommendations are carefully analysed and addressed.
3. Main challenges

Performance assessment and related monitoring and evaluation activities are recognized as critical elements in global health initiatives and in the development sector. They give programmes the chance to highlight their results and their contribution towards global health, to ensure strategic relevance and to identify what does and does not work. However, measuring the specific outcomes and impact of a single programme is challenging, as improvements made in global health are often synergistic among stakeholders and seldom achieved by a single programme.

… of international norms, standards and guidelines. In its efforts to optimize performance assessment, TDR is seeking to harmonize with international practices and engage with stakeholders.

Engagement of TDR's management, leadership and staff in the performance assessment process has been critical for its success. Expanding the focus to outcomes and impact required a major culture change within TDR, but it is now facilitating the implementation of the new strategy 2012-2017.
This section provides the definition of common terms adopted by TDR. The monitoring and evaluation terms used in this document are aligned with those adopted by TDR co-sponsors and other international organizations4.

Accountability – Obligation towards beneficiaries, resource contributors and other stakeholders to demonstrate that activities have been conducted in compliance with agreed rules and standards and to report fairly and accurately on the achievement of objectives vis-à-vis mandated roles and/or plans. It involves taking into account the needs, concerns, capacities and disposition of affected parties, and explaining the meaning of, and reasons for, actions and decisions.

Activity – A set of interrelated actions necessary to deliver specific outputs towards achieving the objectives. In TDR, the activity level encompasses all actions under a team, including contracting for research grants and services.

Attribution – The direct causal link between observed (or expected) changes and a specific activity.

Baseline data – Indicator data that describe the situation at the beginning of the TDR strategy implementation, against which progress can be assessed or comparisons made. Baselines may not be available when measurements are complex and expensive. In such cases the first measurement carried out through this framework will serve as the baseline level.

Contribution – The indirect causal link between observed (or expected) changes and a specific activity or set of activities. It is implied that the change cannot be produced by the activity or set of activities specific to the Programme alone, but will be achieved through the output of the Programme combined with outputs resulting from the activities of partners and other players.

Disease endemic country (DEC) – A low-, middle-income5 or least developed6 country in which infectious diseases (whether endemic or epidemic) contribute to the overall burden of disease7 or mortality and/or constitute a major public health problem.

Equity – Absence of avoidable or remediable differences among groups of people, whether those groups are defined socially, economically, demographically or geographically.

Evaluation – The systematic and objective assessment of the relevance, effectiveness, efficiency, impact and sustainability of an ongoing or completed activity, a team, a policy or the Programme. Evaluation can also address specific issues and answer specific questions to guide decision-makers and managers, and to provide information on the underlying factors influencing a change.

Expected results – Outputs, outcomes and/or impact that TDR intends to produce through its portfolio of activities.

Impact – Positive or negative, primary or secondary long-term change produced by an activity or a set of activities, directly or indirectly, intended or unintended. It is the ultimate change in public health to which outcomes are linked or contribute.

Indicator – See performance indicator.

Input – Financial, human and material resources used for activities.

Key performance indicator – Performance indicator that is shared across the Programme and can be aggregated from the activity level to the team level and from the team level to the Programme level.

Milestone – Performance indicator related to processes and used to track progress towards achievement of outputs. Milestones are key events, achievements or decisions in workplans. They map out the main steps of workplan implementation.

Monitoring – A continuing function that aims primarily to provide managers and main stakeholders with regular feedback and early indications of progress, or lack thereof, in the achievement of intended results. Monitoring tracks the actual performance or situation against what was planned or expected according to pre-determined standards. It generally involves collecting and analysing data on specified performance indicators and recommending corrective measures.

Results chain – Causal sequence of the expected results to achieve objectives and contribute to the broader impact. The TDR results chain reflects the causal sequence of the Programme's expected results to achieve the Programme's objectives.
Related documents

Organization for Economic Co-operation and Development (OECD) – Development Assistance Committee, Working Party on Aid Evaluation, Glossary of Key Terms in Evaluation and Results Based Management, 2002 (http://www.oecd.org/findDocument/0,2350,en_2649_34435_1_119678_1_1_1,00.html, accessed on 14 May 2013).

Organization for Economic Co-operation and Development (OECD) – Development Assistance Committee, Principles for Evaluation of Development Assistance, 1991, reprinted in 2008 (http://www.oecd.org/dataoecd/31/12/2755284.pdf, accessed on 14 May 2013).

Organization for Economic Co-operation and Development (OECD) – Development Assistance Committee, Evaluation Quality Standards, 2006 (http://www.oecd.org/document/30/0,3343,en_21571361_34047972_38903582_1_1_1_1,00.html, accessed on 14 May 2013).

United Nations Development Programme (UNDP) – Handbook on Monitoring and Evaluation for Results, 2002, updated in 2009.

United Nations Development Programme (UNDP) – RBM in UNDP: Selecting Indicators, 2002 (http://www.undp.org/cpr/iasc/content/docs/MandE/UNDP_RBM_Selecting_indicators.pdf, accessed on 14 May 2013).

United Nations Evaluation Group (UNEG) – Ethical Guidelines for Evaluation, 2007 (http://www.unevaluation.org/ethicalguidelines, accessed on 14 May 2013).

United Nations Evaluation Group (UNEG) – Principles of Working Together, 2007, updated in 2012 (http://www.unevaluation.org/papersandpubs/documentdetail.jsp?doc_id=96, accessed on 14 May 2013).

United Nations Evaluation Group (UNEG) – Standards for Evaluation in the UN System, 2005, updated in 2012 (http://www.uneval.org/normsandstandards/index.jsp?doc_cat_source_id=4, accessed on 14 May 2013).

United Nations Children's Fund (UNICEF) – Programme Policy and Procedures Manual: Programme Operations, chapter 6, Programming tool, section 06, Integrated Monitoring and Evaluation Plan (http://www.ceecis.org/remf/Service3/unicef_eng/module2/part1.html, accessed on 14 May 2013).

United Nations Children's Fund (UNICEF) – Understanding Results Based Programme Planning and Management, Evaluation Office and Division of Policy Planning, 2003 (http://www.unicef.org/evaluation/files/RBM_Guide_20September2003.pdf, accessed on 14 May 2013).

Results-Based Management in the United Nations Development System: Progress and Challenges (http://www.un.org/esa/coordination/pdf/rbm_report_10_july.pdf, accessed on 14 May 2013).

The World Bank – Ten Steps to a Results-Based Monitoring and Evaluation System: A Handbook for Development Practitioners, Jody Zall Kusek and Ray C. Rist, 2004.

The World Bank – Annual Report on Operations Evaluation, Appendix A: Overview of Monitoring and Evaluation in the World Bank, 2006 (http://web.worldbank.org/WBSITE/EXTERNAL/EXTOED/EXTANNREPOPEEVAL/0,,contentMDK:21123422~menuPK:4635630~pagePK:64829573~piPK:64829550~theSitePK:4425661,00.html, accessed on 14 May 2013).

The World Bank – Sourcebook on Emerging Good Practices: Emerging Good Practice in Managing for Development Results, 2006 (http://www.mfdr.org/Sourcebook.html, accessed on 14 May 2013).

World Health Organization (WHO) – Eleventh General Programme of Work 2006-2015 (http://whqlibdoc.who.int/publications/2006/GPW_eng.pdf, accessed on 14 May 2013).
Annexes
Annex 1 – Engaging stakeholders in the development of this framework
Annex 2 – Reporting
Annex 3 – TDR Monitoring and Evaluation Matrix
ANNEX 1 – Engaging stakeholders in the development of this framework
The development of the initial TDR Performance Assessment Framework was a collective effort led by Drs Beatrice Halpaap and Fabio Zicker, involving TDR staff and stakeholders. Internal and external consultations helped to develop ownership, capture the perspectives of various stakeholders and enhance harmonization with international practices.

A small internal working group representing TDR's strategic functions was established to assist in the development of an initial draft and subsequent revisions. This group was supported by four additional internal groups that helped develop the key performance indicators used to measure and reflect TDR's performance. The groups worked in consultation with the following stakeholders:

• WHO research programmes, including the Initiative for Vaccine Research; the Research Policy and Cooperation Department; the WHO Ethics, Equity, Trade and Human Rights Department; and the Special Programme of Research, Development and Research Training in Human Reproduction, co-sponsored by UNDP, UNFPA, UNICEF, WHO and the International Bank for Reconstruction and Development.

• WHO regional offices for Africa, the Americas, the Eastern Mediterranean, Europe, South-East Asia and the Western Pacific.

• TDR co-sponsors' evaluation and/or policy offices: UNICEF, UNDP (Global Environment Facility) and the World Bank.

• Research institutions, including the International Centre for Medical Research (CIDEIM), Colombia; the Trypanosomiasis Research Center, Kenya; the International Centre for Diarrhoeal Disease Research (ICDDR,B), Bangladesh; Fundação Oswaldo Cruz (FIOCRUZ), Brazil; and the University of Dundee, UK.

• Research funding institutions and development agencies, including the Wellcome Trust, UK; the Fogarty International Center, USA; the National Research Foundation, South Africa; the Global Fund to Fight AIDS, Tuberculosis and Malaria, Switzerland; the International Development Research Centre, Canada; the Academy for Educational Development, USA; the Department for International Development, UK; and the Swedish International Development Cooperation Agency, Sweden.

• The Secretariat for the Global Strategy and Plan of Action on Public Health, Innovation and Intellectual Property.

• The World Intellectual Property Organization.

An external advisory group, with representation from research and training funding programmes, development agencies, research institutions in disease endemic countries and individual researchers, met in December 2009 to review the TDR Performance Assessment Framework and made recommendations to TDR's Director. The external advisory group was composed of the following individuals:

• Dr Alejandro CRAVIOTO, Executive Director, International Centre for Diarrhoeal Disease Research (ICDDR,B), Dhaka, Bangladesh.

• Professor Alan FAIRLAMB, Professor and Head, Division of Biological Chemistry and Drug Discovery, School of Life Sciences, Wellcome Trust Biocentre, University of Dundee, Dundee, UK.

• Dr Linda KUPFER, Acting Director, Division of International Science Policy, Planning & Evaluation, NIH/Fogarty International Centre, Bethesda, USA.

• Professor Mary Ann D LANSANG (Chair), University of the Philippines, Manila, Philippines; seconded as Director, Knowledge Management Unit, Global Fund to Fight AIDS, Tuberculosis and Malaria, Geneva, Switzerland.

• Ms Jo MULLIGAN, Health Advisor, Department for International Development, London, UK.

• Dr Zenda OFIR (Rapporteur), Evaluation Specialist, Johannesburg, South Africa.

• Dr Claude PIRMEZ, Vice-President of Research and Reference Laboratories, Fundação Oswaldo Cruz (FIOCRUZ), Rio de Janeiro, Brazil.
ANNEX 2 – Reporting

Types of report | Scope | Frequency | Target audience
ANNEX 3 – TDR Monitoring and Evaluation Matrix

Expected results | Key performance indicators | Target (2017) | Source of data | Frequency of measurement

OUTCOME: Infectious disease knowledge, solutions and implementation strategies translated into policy and practice in disease endemic countries
• 1. Number and proportion of new/improved solutions, implementation strategies or innovative knowledge successfully applied in developing countries. | 30; ≥75% | Publications, annual reports, interviews, surveys | Measured annually, cumulative over 6 years
• 2. Number of tools and reports that have been used to inform policy and/or practice of global/regional stakeholders or major funding agencies. | 7 | Publications, annual reports, interviews, surveys | Measured annually, cumulative over 6 years
• 12. Proportion of experts from DECs on TDR advisory … | … | WHO financial data, … | …

Gender:
• 13. Proportion of women among grantees/contract recipients | 50% | WHO financial data, … | Measured annually

Sustainability of outcomes:
• 17. Number of effective public health tools and strategies developed which have been in use for at least two years. | 67 | Annual reports, publications | Measured annually, two years after adoption

Quality of work:
• 18. Proportion of project final reports found satisfactory by peer-review committees. | >80% | Committee meeting minutes and recommendations | Measured annually

Management performance

Effective resource mobilization:
• 19. Percentage of approved biennial budget successfully funded. | ≥100% | TDR JCB-approved budget, WHO financial data | Measured in the second year of each biennium
• 20. Percentage of income received from multi-year agreements. | tbd | WHO financial data, TDR agreements | Measured in the second year of each biennium

• 23. Proportion of significant risk management action plans that are on track. | ≥80% | Quarterly reviews, risk monitoring tool | Measured annually
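The matrix above pairs each key performance indicator with a numeric 2017 target. The framework itself prescribes no tooling, but as an illustration only, a minimal sketch of how measured indicator values could be checked against such targets — the measured values below are invented, and all targets are treated as "at least" thresholds:

```python
from dataclasses import dataclass


@dataclass
class KPI:
    """One row of a monitoring matrix: an indicator, its target and a measured value."""
    number: int
    description: str
    target: float    # threshold set by the 2017 target (treated as a minimum)
    measured: float  # value observed in the reporting period (hypothetical here)

    def on_track(self) -> bool:
        # A KPI is on track when the measured value reaches its threshold.
        return self.measured >= self.target


# Targets mirror the matrix above; measured values are illustrative only.
kpis = [
    KPI(18, "Final reports rated satisfactory by peer review (%)", 80, 85),
    KPI(19, "Approved biennial budget funded (%)", 100, 97),
    KPI(23, "Significant risk action plans on track (%)", 80, 82),
]

for k in kpis:
    status = "on track" if k.on_track() else "below target"
    print(f"KPI {k.number}: {k.measured}% vs target {k.target}% -> {status}")
```

A real reporting pipeline would draw the measured values from the data sources listed in the matrix (publications, WHO financial data, committee minutes) rather than literals.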
TDR/STRA/14.2