ITC Evaluation Guidelines: Second Edition
The designations employed and the presentation of
material in this publication do not imply the expression
of any opinion whatsoever on the part of the International
Trade Centre concerning the legal status of any country,
territory, city or area or of its authorities, or concerning the
delimitation of its frontiers or boundaries.
June 2018
Original: English
Contents
Acronyms
1. Introduction
    Overview
    How evaluation is approached at ITC
        Building Staff Capacity
        Annual Evaluation Work Programme
        Annual Evaluation Synthesis Report
2. Planning for Monitoring and Evaluation
    Project cycle management and evaluation
    Monitoring Plan
        Baseline data
        Complementary indicators
        Monitoring methods
    Evaluation Plan
        Prospective purpose and use of the evaluation
        Types of evaluation at ITC
        Preferences in terms of who will manage the evaluation
        A proportionally costed budget
        Prospective date for the start of the evaluation
3. ITC Results Framework and Evaluation Approach
    ITC results framework
    ITC evaluation approach
4. Evaluation Process
    Pre-evaluation discussion and informal interaction
    Terms of reference
        Background
        Anticipated utility and scope
        Evaluation approach
        Evaluation criteria and questions
        Evaluation management
        Evaluation use
    Preparation for Inception Report
        Desk review
        Using the theory of change
    Inception Report
        Introduction
        Evaluation framework
        Evaluation methodology
        Workplan
        Logistics
        Appendices
Acronyms
Note: Unless otherwise specified, all references to dollars ($) are to United States dollars.
CRC Committee on the Rights of the Child/Convention on the Rights of the Child
GE Gender Equality
HR Human Rights
UN United Nations
1. Introduction
Overview
1. These Evaluation Guidelines aim to build a common organizational understanding of the
methodology, process and quality standards of evaluations, to ensure a level of coherence across
the different types of evaluation at the International Trade Centre (ITC). They codify and harmonize
the methods, tools, processes, criteria and rating system used for evaluation in ITC. The Guidelines
also address the need to ensure thematic harmony and coordination among evaluations managed
by the ITC Independent Evaluation Unit (IEU), and Funder-led Evaluations. In brief, the Guidelines:
3. The ITC Evaluation Policy contains the general principles, standards and processes governing the
evaluation function of ITC. The Policy provides guidance on the scope, practice and use of
evaluation to serve management decisions and policymaking, and the need for coordination with
funders on Funder-led Evaluations related to ITC operations. Building on the Policy, the Guidelines
contribute towards embedding evaluation within ITC. One of the expected effects of a harmonized
quality approach to evaluation is to promote an evaluative culture in the organization; the role of
evaluation is to facilitate the development of evidence-seeking behaviour to support evidence-
based decision-making. Characteristics of an organization with a strong evaluative culture include:
1. International Trade Centre (2015). ITC Evaluation Policy, Second Edition, Geneva. Available from: http://www.intracen.org/uploadedFiles/intracenorg/Content/About_ITC/How_ITC_Works/Evaluation/ITC-Evaluation-Policy-2015-Final.pdf
2. Adapted from: Mayne, John (2010). Building an Evaluative Culture: The Key to Effective Evaluation and Results Management. The Canadian Journal of Program Evaluation, Vol. 24, No. 2, pp. 1-30. Available from: http://evaluationcanada.ca/secure/24-2-001.pdf
4. In line with good practice promoted by the United Nations (UN) Office of Internal Oversight Services
(OIOS), and the UN Evaluation Group (UNEG), ITC has developed an evaluation function. The
evaluation function serves ITC management decision-making on selected policy and strategic
areas, with the purpose of improving the performance and results towards achieving the UN
Sustainable Development Goals (SDGs), and enhancing ITC’s position in the trade and
development arena. The evaluation function is distinct from, but strategically complements, ITC’s
Strategic Planning, Performance and Governance (SPPG) function which oversees planning,
monitoring and reporting. The IEU is the custodian of the evaluation function; its role is to support
corporate learning and accountability.
5. The scope of the Guidelines covers all types of evaluations: Independent Evaluations, Self-Evaluations, Project Completion Reports (PCRs), evaluations undertaken by funders, and reviews. The ITC results framework 3, which sets out ITC’s impact objectives, the operational model of the organization, and its interventions to achieve impact, is central to the approach of these Guidelines. They will be implemented incrementally in line with ITC’s Annual Evaluation Work Programme (AEWP). In specific cases, use of the Guidelines should be adaptable and flexible; methodological rigor should not be insisted upon at the cost of the pertinence or utility of an evaluation exercise.
8. The AEWP is prepared through a priority-setting process and includes all types of evaluations. Whether a project subject to mandatory evaluation undergoes an Independent Evaluation or a Self-Evaluation is decided on the basis of a risk and opportunities assessment. In principle, Independent Evaluations should concentrate on items obtaining high scores against risk factors such as project complexity, funding source, innovation and replication, and strategic partnerships (see Annex I).
3. The ITC results framework is discussed further in Chapter 3.
4. See ITC E-Learning: http://www.intracen.org/itc/market-info-tools/e-learning/
5. For example, the 2018-2019 ITC Proposed programme budget for the biennium can be found at: http://www.intracen.org/itc/about/working-with-itc/corporate-documents/financial-reports/
6. International Trade Centre (2017). Operational Plan 2017, Geneva. Available from: http://www.intracen.org/uploadedFiles/intracen.org/Content/About_ITC/Corporate_Documents/_Operational%20Plan%202017_combined_web.pdf
recommendations to senior management. 7 The AESR also includes a report on the status of the
implementation of past evaluation recommendations that are still ongoing. The AESR is presented
to ITC management, staff, and the Joint Advisory Group (JAG). 8
Monitoring Plan
11. At the project level, the intelligence acquired through monitoring enables managers to update and
adjust their understanding of the required preconditions for success and the intervention strategy.
During implementation, as monitoring data becomes available, project management should
periodically refine the project’s Theory of Change (ToC) based on evidence.
12. Within ITC, the Monitoring Plan is referred to as the Results Monitoring Plan. 10 The purpose of the Monitoring Plan is to address the need for effective management and accountability, and to facilitate effective reporting and evaluation. Monitoring results inform and provide a basis for reporting and evaluation; an analysis of the effectiveness of a Monitoring Plan is an integral part of an evaluation. It is critical for a Project Manager to track and gauge project implementation progress, modify activities according to emerging situations, and keep implementation on track towards achieving stated objectives.

7. See past Annual Evaluation Synthesis Reports at: http://www.intracen.org/itc/about/how-itc-works/evaluation-publications-and-synthesis/
8. See documentation presented at previous JAG meetings at: http://www.intracen.org/itc/events/JAG
9. For further information see www.uneval.org
10. See the Results tab in projects found in ITC’s New Projects Portal.
13. A Monitoring Plan is a fundamental responsibility in project management and can include
monitoring, reporting, and certain Self-Evaluation activities. The Monitoring Plan should use
progress and results indicators aligned to the project’s ToC, and indicate the timeline for monitoring
and reporting deliverables, such as a baseline report, periodic progress reports, midterm
evaluation, PCR, and any other research. Monitoring tools include: project indicators, programme
indicators, corporate indicators, means of verification, data collection frequency, baseline figures
and total target figures (including annual and overall figures). Including research data in a
Monitoring Plan is also encouraged. In-house resources for research information can be derived from
sources such as ITC’s benchmarking tool for trade and investment support institutions (TISI) 11 or
impact data from ITC’s annual SME Competitiveness Outlook 12.
14. External and internal factors are also considered at the project design stage and are monitored
throughout project implementation. In terms of external factors (those outside of the control of the
Project Manager, such as political, climatic and security), risks and assumptions are identified in
the project logical framework, and a risk management plan is also established. The Plan includes
the probability of the risk occurring (low, medium, or high), impact on project results (low, medium,
or high), risk reduction measures, additional resources/activities needed, and the person(s)
responsible. These elements are applied to each level of the project logical framework. A good
Monitoring Plan takes into account internal factors (those within the direct and indirect control of
the Project Manager, such as verifying the state of readiness of beneficiaries and partners,
ensuring the use of outputs, following-up on procurement, and supporting partners and
beneficiaries to achieve outcomes).
Baseline data
15. Baseline data is information that measures conditions (appropriate indicators) prior to the start of
a project to be used for later comparison. Baseline data provides an historical point of reference
to inform project planning such as target setting, to monitor change during project implementation,
and to evaluate change for impact. Baseline data can be obtained through sources such as a
baseline study, target figures reached in previous projects, data determined in a needs
assessment, and national statistics.
Complementary indicators
16. In addition to ITC corporate indicators 13, monitoring could comprise complementary indicators to
measure issues and progress related to project relevance, efficiency, impact, and sustainability 14;
or to assess the performance of specific key components in the results chain 15. Complementary
indicators could also include indicators developed by stakeholders. 16
11. ITC’s benchmarking tool is available at: http://www.tisibenchmarking.org/benchmarkredesign/
12. ITC’s annual SME Competitiveness Outlook is available at: http://www.intracen.org/SMEOutlook/
13. See full presentation of corporate indicators in the Project Management Guidelines: https://our-intranet.itc-cci.net/oed/sppg/ProjectManagement/SitePages/Corporate%20results%20indicators%20and%20results%20toolkit.aspx
14. Some examples: Relevance ― Percentage of partners and beneficiaries buy-in (their opinion whether the intervention meets their needs and priorities); Efficiency ― Cost against budget and areas where overruns or underspending occur; Impact ― Percentage of SMEs in the target population having transacted international business; Sustainability ― Progress against the plan in negotiating an exit strategy with key partners.
15. According to the ITC Theory of Change approach, monitoring could be interested in measuring progress in the target population in terms of increased knowledge, skills and exchange, and improved consensus and feasible plans for action; and, in assessing progress / achievement in the fourth intermediate outcome related to external parties (see Figure 2).
16. Such as the Donor Committee for Enterprise Development (DCED) Standard for Results Measurement, http://www.enterprise-development.org/measuring-results-the-dced-standard/
Monitoring methods
17. The Monitoring Plan includes monitoring methods that are used to address how and why the
preconditions for success are met to achieve project intermediary outcomes, outcomes and impact.
Table 1 below contains details related to monitoring methods, examples, and timing.
Table 1. Monitoring methods, examples, and timing

Method: Observing and tracking progress in the causality and contribution of the intervention to change
Example: This is usually achieved through workshops engaging with partners and beneficiaries to develop and/or assess progress in the implementation of the intervention strategy and adjustments in the ToC related to an evolving context
Timing: At the onset and at midterm stages of the different interventions

Method: Observing, tracking and illustrating intervention implementation and its effects (outputs)
Example: Periodic assessment on a limited number of observations; for example, about participants’ valuation of group training and their use of acquired capacity
Timing: For example, three months after training

Method: Observing and tracking the progress of the role that partners and beneficiaries play in the achievement of intermediate outcomes, outcome, and impact
Example: This will support the plausible assessment of the intervention contribution to the achievement of intermediate outcomes, outcome, and impact. It will use methods such as focus groups involving a limited number of key partners and stakeholders to monitor the progress of partners and beneficiaries in achieving intermediary outcomes, outcome, and impact
Timing: On a yearly basis

Method: Identifying specific stories of change to illustrate the intervention effects in areas that are otherwise difficult to measure by quantitative means
Example: Stories are selected based on their significance. This will be achieved through conducting periodic direct interviews focusing on a few selected topics
Timing: On a yearly basis
Evaluation Plan
18. At ITC, all trade-related technical assistance (TRTA) projects are expected to undergo some form of evaluation and public disclosure.
Projects with a total budget over $2 million are subject to mandatory evaluation, which may take
the form of a Self-Evaluation or an Independent Evaluation or a Funder-led Evaluation. At the
Evaluation Plan stage, the form the evaluation will take is undefined; the decision is taken when
IEU is establishing its AEWP for the coming year. Projects subject to mandatory evaluation, which
do not undertake an Independent Evaluation or a Funder-led Evaluation, conduct a Self-Evaluation.
Projects with a total budget less than $2 million can be subject to a Funder-led Evaluation. If this is
not the case, they can choose to do an optional evaluation, which normally takes the form of a Self-
Evaluation. All projects are subject to a PCR (see the template in Annex VI), which is a form of
Self-Evaluation and is conducted by the responsible operational unit at the close of the project.
When a rapid and independent analysis of a specific project or another type of undertaking is
required, ITC senior management can directly commission a review. A review is an ad hoc, often
rapid assessment of the performance of an undertaking. 17
17. A review is directly managed by IEU. It is a flexible tool that uses evaluation methods although it is not bound to applying the due process of an evaluation, in particular diffusion and follow-up.
19. An Evaluation Plan is a management tool used to address the evaluation expectations of key
stakeholders (ITC management, funders, clients, partners), and to arrange resources, including a
commensurate budget for conducting the planned evaluation(s). The Evaluation Plan is prepared
prior to project approval. Project designers/managers should take the opportunity to discuss the
Evaluation Plan with the IEU. Under normal circumstances, a final evaluation should start after the
completion of the project intervention to allow for an assessment of results. Midterm evaluations
are undertaken halfway through the intervention life-cycle.
20. The responsibility for conducting quality evaluations is shared by the evaluation practitioners and
Project Managers who monitor project results from the design stage throughout the implementation
process until completion of the project. 18 The suggested outline for a project Evaluation Plan is
offered in Box 1 below. 19
4. Prospective date for the start of the evaluation (for final evaluations, ideally well after all project
activities have been completed)
Project Completion Report (PCR). A PCR is a standardized report to assess and learn from
the performance of an intervention by those responsible for the design and delivery of the
project. A PCR is a form of Self-Evaluation.
18. In some cases, it is even necessary to monitor results beyond the end of project implementation to allow for a maturity period to reveal impact.
19. As of May 2018, Project Managers can complete the Evaluation Plan in the projects portal.
20. Definitions for Independent Evaluation and Self-Evaluation have been adapted from the Organisation for Economic Co-operation and Development (2010). Glossary of Key Terms in Evaluation and Results Based Management, Paris. Available from: http://www.oecd.org/dac/evaluation/publicationsanddocuments.htm
should ensure the IEU is in contact with the funder to ensure harmonization of evaluation
approach and methods. In addition, these Guidelines should be shared with the funder.
Characteristics of each evaluation type (Independent Evaluation, Self-Evaluation, Funder-led Evaluation, PCR):

Projects < $2 million:
    Independent Evaluation: No (except in extraordinary cases)
    Self-Evaluation: Optional
    Funder-led Evaluation: On request of funder
    PCR: Mandatory

Projects > $2 million:
    Independent Evaluation / Self-Evaluation: Mandatory evaluation (an Independent Evaluation when incorporated as such in the IEU AEWP, or a Self-Evaluation in other cases)
    Funder-led Evaluation: Not subject to additional mandatory evaluation (except in extraordinary cases)
    PCR: Mandatory

Evaluation Plan, including budget, prepared in the project design phase:
    Independent Evaluation: Yes
    Self-Evaluation: Yes
    Funder-led Evaluation: No
    PCR: Yes

Incorporated in the IEU AEWP:
    Independent Evaluation: Yes
    Self-Evaluation: Yes
    Funder-led Evaluation: Yes
    PCR: Yes

IEU follow-up on recommendations implementation:
    Independent Evaluation: Yes
    Self-Evaluation: No
    Funder-led Evaluation: No
    PCR: No (except in extraordinary cases)

Integration of learnings into the AESR:
    Independent Evaluation: Yes
    Self-Evaluation: Yes
    Funder-led Evaluation: Yes
    PCR: Yes
27. Within the UN system of assistance to developing countries, ITC is the focal point for technical
assistance and cooperation activities in trade promotion and export development. It focuses on the
implementation of international development goals ― particularly those set out in the 2030 Agenda
for Sustainable Development. 22 23
28. Building on the ITC corporate logical model, 24 Figure 2 below is a graphic illustration of ITC’s results
framework, which organizes the different components of the results chain, and in which the role of
ITC is understood as that of a change facilitator supporting its partners and stakeholders to realize their
development objectives and contribute towards the SDGs. It is important to note that ITC has two
guidance documents which should be used at the project design stage: Environmental
Mainstreaming: A Guide for Project Managers 25 and the ITC Gender Mainstreaming
Policy 26. These two mainstreaming documents will guide projects in these two SDG
dimensions.
29. With the use of the results framework, ITC interventions provide partners and beneficiaries with
capacity-building services to increase their ability and likelihood to act, largely through four types
of TRTA outputs: group training, advisory services, technical material, and publications. These
outputs lead towards strengthened capacities, including increased awareness and interest;
increased knowledge, skills and exchange; and improved consensus and feasible plans to act (A1).
30. ITC’s contribution is realized through achieving four intermediate outcomes related to policymakers
and regulators, Trade and Investment Support Institutions (TISIs), and SMEs (A2, B1, C1 and C2);
these, in turn, contribute to the corporate outcome of improved international competitiveness of
SMEs in developing countries and transition economies for inclusive and sustainable development
(C3 and C4). Finally, it is anticipated that ITC’s corporate outcome of enhanced inclusive and
sustainable growth and development in developing countries, especially LDCs, and countries with
economies in transition through trade and international business development, will contribute
towards the SDGs.
21. United Nations (2016). Proposed strategic framework for the period 2018-2019, Part two: biennial programme plan, Programme 10, Trade and Development, New York, A/71/6 (Prog. 10). Available from: http://unctad.org/meetings/en/SessionalDocuments/a71d6prog10_en.pdf
22. Details of the SDG targets ITC contributes towards can be found on the ITC website at: http://www.intracen.org/itc/goals/Global-Goals-for-Sustainable-Development/
23. It is worth noting that ITC is also shaped by the objectives of the Programme of Action for the Least Developed Countries for the Decade 2011-2020; the resolutions of the High Level Fora on Aid Effectiveness; the Addis Ababa Agenda for Action on Financing for Development; and the 21st Conference of the Parties to the United Nations Framework Convention on Climate Change. As well, ITC plays an important role in furthering the implementation of WTO ministerial declarations, United Nations (2016), op. cit.
24. Ibid.: 18-19; and, United Nations (2015). Proposed programme budget for the biennium 2016-2017, Part IV, International cooperation for development, Section 13, International Trade Centre, New York, A/70/6 (Sect. 13)/Add.1., pp. 12-14. Available from: http://www.intracen.org/uploadedFiles/intracenorg/Content/About_ITC/Corporate_Documents/Financial_reports/A_70_6%20(Sect.%2013)Add.1.English.pdf
25. Found on the ITC website at http://www.intracen.org/publication/Environmental-Mainstreaming-A-Guide-for-Project-Managers/
26. Found on the ITC website at http://www.intracen.org/itc/women-and-trade/programme/
[Figure 2 (excerpt): Capacity-building outputs (group trainings, advisory services, technical material, publications), with the objective to support change in partners’ conditions and actions] 27
27. A1, A2, B1, C1 through C4 are all ITC corporate indicators, found in the corporate logic model. Based on the ITC 2018-2019 Strategic Framework.
31. Information related to the baseline data of corporate indicators (A1, A2, B1, C1, C2, C3 and C4)
is found in the corporate logic model based on the performance of previous bienniums. This
information is used to establish the target figures for the upcoming biennium. Monitoring and
reporting of progress made on the corporate indicators are set out in operational plans and annual
reports.
32. External factors, which may have an influence on the successful accomplishment of ITC objectives,
include the following assumptions:
a) The international community and other stakeholders remain fully engaged and committed
to working with ITC;
c) The political capacity and geographical conditions in recipient countries remain stable for
the implementation of programme activities;
d) The mandates of the UN and other international organizations that impact ITC remain the
same; and,
e) The enabling environment in the form of fiscal and monetary policies and other measures,
including physical infrastructure in recipient countries, does not deteriorate.
33. In principle, the ITC project results chain and ToC are aligned with the ITC results framework,
particularly since ITC has created a ToC for each of its corporate programmes. 28
35. Results chains provide the building blocks for developing theories of change. With the use of
assumptions, risks, external and internal factors, theories of change expand on results chains to
explain why and how a set of results is expected to occur. While logical frameworks tend to focus
on results intended (the long-cycle logic of the logical framework), theories of change focus on the
connections between the boxes (which can be thought of as the short-cycle logic), as seen in Figure
3 below. In other words, theories of change explain how a project is expected to bring about the
desired results, rather than just describe the successive results. 30
36. It is anticipated that the causal change between each of the components, which rests outside of
the results chain itself, is where the ToC takes place. In other words, to successfully deliver to our
beneficiaries and partners, the change that occurs between the levels of activities, outputs,
intermediate outcomes, corporate outcomes, and impact represents the value added by ITC to its
partners and beneficiaries. Nevertheless, this desired change is subject to external and internal
factors that all have an influence on the successful accomplishment of the results framework, as
identified at the Monitoring Plan stage.
28. See: International Trade Centre (2016). Report of the 50th Session of the ITC Joint Advisory Group Meeting, Geneva. Available at: http://www.intracen.org/uploadedFiles/intracenorg/Content/About_ITC/Working_with_ITC/JAG/Redesign/_en_2016%20JAG%20Report_EN.pdf
29. Adapted from IFAD (2016). Evaluation Manual, Independent Office of Evaluation of IFAD, Rome.
30. Adapted from: Treasury Board of Canada (2012). Theory-Based Approaches to Evaluation: Concepts and Practices, Ottawa, page 5. Available from: https://www.canada.ca/en/treasury-board-secretariat/services/audit-evaluation/centre-excellence-evaluation/theory-based-approaches-evaluation-concepts-practices.html
[Figure 3: Results chain, from Activities to Outputs to Outcome to Impact]
37. Like those of many other development agencies, ITC interventions tend to have a stronger
influence on project outputs and decreasing influence on intermediate outcomes and long-term
impact. Figure 4 below illustrates how change can be influenced, but not directly determined; there
are often many other contributors to, and hindrances against, the same desired impact, as changes
are taking place within an interconnected web of relationships and systems. Moreover, partners
and beneficiaries may develop a wide range of activities in achieving the intermediate
outcomes, even as a result of similar capacity-building interventions implemented by ITC. Therefore,
when examining performance in delivering change to partners, evaluation recognizes that changes
are complex, non-linear and multidirectional, and that performance for each intervention requires
flexibility and the ability to innovate. 31
31. Adapted from: Chipimbi, Robert and Simon Hearn (2009). Outcome Mapping, Bringing learning into development programmes, Presentation, 15-18 September, Cape Town South Africa.
[Figure 4: Inputs, Activities and Outputs leading to Intermediate outcomes, Outcome and impact] 32
38. With this caution in mind, since ITC is accountable for achieving the ITC results framework, evaluation
examines whether and how partners and beneficiaries have improved their conditions and actions
because of the intervention carried out by ITC. Accountability rests with management, although
the ultimate ability to achieve intermediate outcomes and impact depends upon the partners and
beneficiaries working in countries. This is why, when studying the ToC, evaluation is interested in the
strategy that the intervention has followed to support partners and beneficiaries to achieve impact,
while acknowledging that higher-value results in the results framework, such as outcomes and impact,
might take place outside of the project’s area of direct management.
39. As a result, while evaluation examines project accountability for achieving the intermediate
outcomes and outcomes planned to contribute to realizing the SDGs, it is not limited to analysing
the causal attribution of changes uniquely tied to the intervention within the area of direct
management. 33 This examination includes assessing the effectiveness of an intervention in terms
of its support to partners and beneficiaries, and the observed impact in terms of improvements
(conditions and actions) for partners and beneficiaries.
32 Ibid.
33 Contribution is defined as describing the intervention as one of many contributory causes to the outcome, based on a results chain or theory of change. Source: OECD-DAC 2002.
4. Evaluation Process
40. Building on the Evaluation Plan, all evaluations at ITC generally follow the same process, as
outlined in Box 2 below.
Box 2. The evaluation process (steps include): 2. Terms of reference; 3. Inception Report; 7. Management Response; 8. Follow-up on recommendations.
Independent Evaluations
41. To pave the way for a useful evaluation, the first step is an informal exchange with management
to discuss the following points:
• What is to be evaluated?
• What is the stage of the project ToC at the time of the evaluation?
As a result of this dialogue, the evaluation manager writes, in full independence, a two-page brief
summarising the IEU's understanding of the key evaluation elements, which is circulated to
management and other in-house stakeholders.
Self-Evaluations
42. This stage may be applied when the project is being jointly implemented by more than one
ITC Section or Division, or by more than one entity. Consultation is critical to ensure the relevance of
the Self-Evaluation to the expectations of stakeholders. Consultations should support the Self-
Evaluation Manager in identifying the evaluation issue(s) of particular interest and the key learning
aspects to be outlined in the Terms of Reference (ToR). Canvassing stakeholders' expectations
helps to ensure the Self-Evaluation is viewed as relevant and useful, and proper coordination with
stakeholders encourages their participation and support during the Self-Evaluation process
and follow-up actions. The following questions may be useful to kick-start a consultation meeting:
Terms of reference
43. The next step in the preparation of an evaluation is the drafting of the ToR, which should be built
into the Monitoring Plan and the Evaluation Plan, prepared at the project design stage. This
document provides details of the requirements and expectations related to an evaluation and
serves as the basis for a contractual arrangement between the IEU and the evaluator or evaluation
team, in the case of an Independent Evaluation, and between the Project Manager and the
evaluator or evaluation team, in the case of a Self-Evaluation. 34 Box 3 below provides an indicative
outline of an evaluation ToR. Most of the items developed at the ToR stage are refined and further
developed at the inception and reporting stages of the evaluation process.
Independent Evaluations
44. The IEU drafts the ToR in consultation with management and key stakeholders for comments
before finalization.
Self-Evaluations
45. The Project Manager drafts the Self-Evaluation ToR, which is subsequently checked for quality by
the responsible Division and commented on by the IEU and stakeholders. The commenting
process is an opportunity to promote a common understanding and build consensus. The IEU provides
support and guidance to the Project Manager throughout this process to ensure the quality
of the ToR meets the expected standards.
Box 3. Indicative outline of an evaluation ToR (items include): 3. Evaluation approach; 5. Evaluation management; Evaluation deliverables; 6. Evaluation use.
34 Adapted from: Independent Evaluation Group (2011). Writing Terms of Reference for an Evaluation: A How-to Guide, Washington, World Bank Group. Available from: http://siteresources.worldbank.org/EXTEVACAPDEV/Resources/ecd_writing_TORs.pdf
Background
46. The background and context summarise the overall concept and design of the intervention being
evaluated, including an assessment of its strategic objectives; the planned time and resource
availability for its implementation; institutional and management arrangements; and the clarity, logic,
and coherence of the intervention document or concept paper. If project or programme objectives are
revised during project implementation, the project should be assessed against the revised objectives,
provided they are approved by the respective governing body. 35 The evaluation should examine the
results chain and clarify the ToC to determine whether the pre-conditions for success have been
considered and successfully adapted to the changing context and intervention strategy. If the project
ToC is not well defined, it will need to be reconstructed and included in the Inception Report; this
possibility should be noted in the ToR.
Evaluation approach
48. The evaluation approach is the conceptual way of designing and conducting the evaluation,
depending on factors that can include the design of the project, the timing of the evaluation,
the objective of the evaluation, etc. The evaluation approach examines the ToC to verify
the extent to which the theory matches what actually happened. Using the ToC offers several
advantages, including:
• It asks not only what worked, but also why and how it worked.
• It comprises two key parts: conceptual (developing the causal model to guide the
evaluation) and empirical (testing the model to investigate how the project contributed to
intermediate outcomes, outcomes, and impact).
• It often proposes the use of mixed methods (quantitative and qualitative) in data
collection. 36
49. While the evaluation approach is delineated at the ToR stage, it is usually finalized at the Inception
Report phase.
35 This step is crucial, since without official approval the evaluation is sometimes obliged to use the initially approved objectives, which could affect analysis and findings.
36 Adapted from: Governance and Social Development Resource Centre (2012). Helpdesk Research Report: Theory-based evaluation approach, Birmingham. Available from: http://www.gsdrc.org/docs/open/hdq872.pdf, and Treasury Board of Canada Secretariat (2012), op. cit.
37 The criteria used in ITC evaluations are consistent with the five evaluation criteria laid out and defined by the OECD Development Assistance Committee (DAC). Source: Organisation for Economic Co-operation and Development (2015). DAC Criteria for Evaluating Development Assistance, Paris.
EVALUATION CRITERIA
Relevance. Assesses the consistency of the objectives of an intervention with ITC's corporate
goals and comparative advantages, the client country's development strategy or policy priorities, and
the needs of beneficiaries. The adequacy and coherence of the components of the intervention, and
of the related strategy to achieve those objectives, should also be assessed.
Effectiveness. Assesses the extent to which the intervention's objectives have been achieved or are
expected to be achieved, taking into account their relative importance.
Efficiency. Assesses the extent to which the intervention has converted its resources and inputs
(funds, expertise, time, etc.) economically into results (i.e., the results chain, ToC and intervention
strategy).
Impact. Measures changes that have occurred or are expected to occur for partners and
beneficiaries: the positive or negative, direct or indirect, intended or unintended,
medium- to long-term results caused by the interventions. 40 The impact domains aligned to the UN
2030 SDGs are considered in assessing impact.
Sustainability. Assesses the likelihood of continued long-term benefits of the interventions and
the resilience to risk of net benefit flows over time.
CROSS-CUTTING DIMENSIONS
Human Rights and Gender Equality. Assesses whether human rights and gender equality
are sufficiently embedded in the intervention, and the extent to which the intervention has contributed
to their enhancement. 41
Environment and climate change. Assesses, in the trade development context, the extent to which
the interventions have contributed to the protection and rehabilitation of natural resources and the
environment, and to climate adaptation and resilience.
Innovation. Assesses the extent to which the intervention has introduced innovative approaches
to achieve ITC's goals or better adapt to emerging contexts, and whether these innovations have been
replicated or scaled up by development partners.
Evaluation management
Independent Evaluations
51. The IEU is responsible for managing Independent Evaluations at ITC. All evaluations undertaken
by the IEU are collaborative and include project management and stakeholders throughout the
evaluation process. To ensure participation and ownership among key stakeholders, regular
consultations will be conducted during the evaluation process. The main clients shall be involved
in commenting on the draft deliverables including ToR, Inception Report and draft Evaluation
Report.
52. ITC management ensures that the IEU has timely and sufficient access to information needed for
conducting Independent Evaluations and that the operational managers of the Divisions,
programmes, and projects actively cooperate with the IEU and participate in the evaluation
processes.
53. As part of the management of the evaluation, the ToR specify the composition of the evaluation
team. The team includes a lead evaluator, designated by the Head of the IEU, who can be either
a staff member or an external expert recruited as an evaluation consultant. Depending on the
complexity and needs of the evaluation, the team may also include other team members, such as
a research analyst and/or an associate evaluator, or other ITC staff members working as thematic
specialists for the evaluation.
In general, the recruitment of evaluation consultants is completed during the ToR preparation and
consultation phase. In certain cases, consultants may be hired during subsequent evaluation
phases, depending on the situation and needs. The Head of IEU is responsible for choosing the
team, including the recruitment of evaluation consultants, who must sign non-disclosure agreements
to avoid possible conflicts of interest. 42
54. Sometimes, a preliminary mission may be undertaken to prepare the stakeholders in the field,
explain the ToR to national partners, establish initial connections with stakeholders on site, clarify
their concerns on the upcoming evaluation, and make necessary agreements for the main mission
to follow.
Self-Evaluations
55. A Self-Evaluation is 'an evaluation carried out by those who are entrusted with the design and
delivery of a development intervention'. 43 Self-Evaluations are conducted according to the
procedures set out below and in compliance with the ITC Evaluation Policy. The Divisional Director
holds oversight responsibility for Self-Evaluation exercises; he/she designates a manager
responsible for conducting the exercise. In principle, the Self-Evaluation Manager should be the
Project Manager of the project being evaluated. However, due to workload constraints or other
reasons, management of the Self-Evaluation may be delegated to another staff member within the
same project/programme or from a different Section or Division. Key features of a Self-Evaluation
include:
Autonomy
• Lessons learned, good practices and possible recommendations are used in decision-making
at the discretion of management
42 International Trade Centre (2015). ITC Evaluation Policy, Second Edition, Geneva, pp. 8-9. Available from: http://www.intracen.org/uploadedFiles/intracenorg/Content/About_ITC/How_ITC_Works/Evaluation/ITC-Evaluation-Policy-2015-Final.pdf
43 Organisation for Economic Co-operation and Development (2010), op. cit.: 35.
Light process
• The ToR and final report are the only formal deliverables; the Inception Report and draft
Evaluation Report can be informal documents
• Key learning messages extracted from Self-Evaluations are consolidated and presented in the
AESR
56. The IEU and the respective Division should take into account the workload of the Self-Evaluation
Manager and the quality of the deliverables. The Self-Evaluation Manager is responsible for managing
the budget for the evaluation exercise. He/she also determines the evaluation users, the objectives to
address, the evaluation issue(s) of particular interest, scope, methodology, and timing. In addition, the
Self-Evaluation Manager should choose the relevant learning and accountability aspects of the
evaluation. The use of Self-Evaluation findings in decision-making is determined by management.
Accordingly, the IEU does not follow up on the implementation of Self-Evaluation recommendations.
57. The IEU provides customized advice to the Self-Evaluation Manager on evaluation planning, methods,
and the drafting of the ToR, Inception Report and final Evaluation Report. For quality
enhancement purposes, the IEU provides technical comments on an advisory basis concerning the
main Self-Evaluation deliverables, including the draft ToR, Inception Report and final report.
58. In addition to these advisory services, IEU conducts a quality review of Self-Evaluation ToR and
final reports to assess whether these documents meet the required standards. In terms of the Self-
Evaluation ToR, the review is conducted in line with the criteria described in the quality checklist in
Table 7 below. The IEU also plays a role in promoting lessons learned and good practices
generated from Self-Evaluations. The findings and results of Self-Evaluation Reports will be taken
into account in the ITC AESR.
59. Should it be decided to hire an evaluation consultant or evaluation team, supplementary ToR will
be required. These can be based on the Self-Evaluation ToR (discussed further below) but will
include the specific requirements for individual consultants that will be used during the hiring
process. The hiring of evaluation consultants should take place as soon as the agreement is
reached on the draft ToR. The Self-Evaluation Manager is responsible for the hiring of
consultant(s) and/or building an evaluation team, should it be required. IEU is available to assist
in identifying experienced evaluation consultants. All hiring procedures should be completed prior
to the desk review.
60. In accordance with the Evaluation Policy, the following rules apply to avoid conflicts of
interest when hiring consultants for a Self-Evaluation:
• ITC management should ensure that the Self-Evaluation Manager and Self-Evaluation
consultants will not be subject to any form of undue influence at ITC, such as partial
information, bias against or in favour of certain stakeholders, retaliation, and actual or
perceived threats in relation to the professional judgments made by the Self-Evaluation
team.
61. Should an evaluation team be established, the Self-Evaluation Manager should arrange regular
briefing sessions with the team to discuss progress, achievements, problems, further steps, timing,
etc. Team coordination helps to cross-check sources of information, assess the strength of the
factual base, and identify the most significant findings.
Evaluation use
62. The Communication and Learning Plan is an integral component of the evaluation ToR. A
dissemination plan should be included, in which learning events are identified and follow-up activities
set out. The Communication and Learning Plan is built on throughout the evaluation process. It
should be noted that communication products for a Self-Evaluation, such as regular updates or
evaluation communication notes, may be prepared for dissemination according to the information
needs of stakeholders.
64. Based on the desk review and evaluability assessment (discussed below), the Inception Report
should tentatively clarify the intervention results chain, ToC, and strategy, and provide early
findings to be further analysed and substantiated during the data collection phase. Special
attention should also be paid to analysing the changing environment and its consequences.
• Results chain;
• Assumptions 45, risks 46 and, in some cases, mechanisms 47 associated with each link in the
logical framework and/or the results chain;
• Any empirical evidence supporting these assumptions, risks and external factors.
68. The evaluator validates or reconstructs the ToC with empirical evidence and an account of major
external influencing factors. The ToC provides the basis for arguing that the intervention is
making a difference; once the weaknesses in the project logic have been identified, this helps to
pinpoint where evidence for strengthening the claim of 'making a difference' is most needed. To
summarize, the contribution claim is equal to the verified ToC plus the other key influencing
factors. 49
Inception Report
69. An Inception Report of an evaluation is prepared by the evaluator or evaluation team after an initial
desk review of the relevant project documentation has been carried out. The Inception Report sets
out the conceptual framework to be used in the evaluation, the evaluation questions and
methodology, including information on data sources and collection, sampling and key indicators.
The Inception Report also includes a timeline for the evaluation and drafts of data collection
instruments. 50 The Inception Report is one of the key deliverables in the evaluation process; it is
44 Adapted from: Department for International Development (2013). Practical Approaches to Theories of Change in Conflict, Security, and Justice Programmes, London. Available from: https://www.sfcg.org/wp-content/uploads/2014/04/PartII-Theories-of-Change-for-Monitoring-and-Evaluation-With-Annexes-SFCG.pdf
45 Assumptions are defined as 'key events or conditions that must occur for the causal link to happen'. Source: Treasury Board of Canada (2012), op. cit.: 6.
46 Risks are defined as the 'influences or events outside the intervention that may inhibit the causal link from happening'. Source: Ibid.
47 Mechanisms are defined as 'the causal processes that enable the program to produce results'. Source: Ibid.
48 External factors are defined as 'circumstances beyond the control of the program, such as social, political or economic context, which may affect the program's ability to achieve an intended result'. Source: Ibid.
49 Ibid.
50 Adapted from: Better Evaluation (2017). Inception report. Available from: https://www.betterevaluation.org/en/evaluation-option/inception_report
shared with project stakeholders for comments and feedback at the draft stage and circulated to
all at the final stage. 51 An indicative outline of an Inception Report 52 is presented below in Box 6.
Box 6. Indicative outline of an Inception Report (items include): 2. Evaluation Framework; 3. Evaluation Methodology; 4. Workplan (develop a timeline); 5. Logistics; 6. Appendices (Terms of Reference).
Introduction
70. Drawing on the documentation reviewed during the desk study, the introduction of the
Inception Report should provide a description of the project or programme being evaluated.
Evaluability Assessment
71. An evaluability assessment helps to determine whether the project or programme is designed well
enough to be successfully implemented and achieve results. Evaluability is defined as the extent to
which an activity or project can be evaluated in a reliable and credible fashion. 53 The evaluability
51 United Nations Office on Drugs and Crime (2014). Guidelines for Inception Reports, Vienna. Available from: https://www.unodc.org/documents/evaluation/Guidelines/UNODC-IEU_Inception_Report_Guidelines.pdf
52 Adapted from: United Nations Educational, Scientific and Cultural Organization (2008). Guidelines for Inception Reports, Paris. Available from: http://unesdoc.unesco.org/images/0015/001583/158397e.pdf
53 Organisation for Economic Co-operation and Development (2010), op. cit.: p. 21.
assessment is used to advise on the use of the evaluation resources, evaluation criteria, data
collection, analysis methods and field visits. 54
Independent Evaluations
72. It is performed during the Inception Report stage and examines the design (focusing on the ToC),
inception (focusing on the ToC and data availability), and implementation stages (focusing on the
ToC, data availability, and stakeholders) of a project and is based on the following three groups of
indicative questions: 55
1. Does the quality of the project design allow for the evaluation?
• Was monitoring data planned and collected on a regular basis against the performance
indicators?
• Does the timing of the evaluation fit into the programme cycle (usefulness of evaluation
at that point in time)?
• Can external factors (political, climatic, security, etc.) hamper the evaluation?
• Are key stakeholders available for interviews in the field and in headquarters during the
planned evaluation time period?
73. If the answers to any of the above groups of questions are mostly negative, this should be noted
in the Inception Report, as it helps to describe the limitations that may be encountered during
the evaluation. The methodological limitations of an evaluation should be acknowledged in the
Inception Report to avoid misleading interpretations. These limitations often relate to data
availability, sampling size and methods, access to informants, potential survey response rates and
the maturity of changes needed to show clear impact. Having this information in the report keeps
stakeholders informed.
Self-Evaluations
74. Project Managers are encouraged to carry out evaluability assessments for Self-Evaluations which
can help to organize project documentation as well as determine the extent to which the project
can be evaluated. It should be noted that this is not mandatory for Self-Evaluations, but it is strongly
encouraged as it offers an opportunity to discuss any risks or limitations that may undermine the
reliability and validity of the evaluation results at an early stage.
54 In some cases, the evaluability assessment might require field missions to evaluate the readiness of key stakeholders for conducting the evaluation and the availability of data, and to identify information gaps and gather early findings.
55 Adapted from: United Nations Office on Drugs and Crime (2012). Evaluability Assessment Template. Available from: https://www.unodc.org/documents/evaluation/Guidelines/Evaluability_Assessment_Template.pdf
75. The purpose and scope of the evaluation are usually set out in the ToR. In this section, also include
the complete set of evaluation questions and elaborate on them if required. If preliminary
discussions with project teams identified any additional questions, they
should be included here. In addition, if any questions set out in the ToR are deleted, this must be
mentioned, along with the reason for their exclusion.
Evaluation use
76. The evaluation users may have been initially identified at the ToR stage. However, it is important
to further define the primary intended users in the Inception Report as this will increase the
likelihood of the evaluation being completed and used in an appropriate way. Another element to
be included in the Inception Report is evaluation follow-up, which should identify who will be
responsible for following up on the evaluation to ensure its use. Further discussion of
evaluation follow-up is found below.
Evaluation framework
Evaluation approach
77. Discuss the overall approach of the evaluation; this should incorporate an analysis of the ToC of
the project or programme. The ToC of the project or programme should be included here. If it has
already been created it can be taken from the project documentation. If it does not exist, it should
be reconstructed as described above. This can be an initial reconstruction, which can be built on
during the course of the evaluation. The project management should also be described.
78. All components of the project or programme being evaluated should be discussed, as well as all
assumptions, risks, and limitations. Including a discussion of the risks and limitations helps to
identify elements that may undermine the reliability and validity of the evaluation results.
Evaluation methodology
79. Discuss the data collection and data analysis methods that will be used for the evaluation. State
the limitations of each method and include the level of precision required for quantitative methods
and value scales or coding used for qualitative methods. ITC evaluations adopt a mixed-methods
approach and include questionnaires, interviews, focus groups, surveys, document review, and
observation. The evaluation questions should be addressed with the appropriate information-
gathering techniques, whether they be qualitative or quantitative. Data collection methods should
be adequately tailored – as needed – to reflect the specific circumstances and applied in relation
to the evaluation approach (see Table 3 below).
80. To enhance the validity and credibility of evaluation findings, the Inception Report plans for
triangulation of the information obtained from different methods of data collection. Triangulation is
a data analysis technique used to cross-check and verify data and evidence collected from different
sources; the term indicates that several independent methods have been used to check the results
of the same project. When using qualitative and/or quantitative data collection, triangulation
provides a confirmatory measurement. 56 The use of triangulation requires multiple data collection
techniques, which help to reduce bias. Annexes II and III provide further details on how to deal with
frequent difficulties related to data collection, and how to cope with some common data collection
biases.
56 Adapted from: Scriven, Michael (1991). Evaluation Thesaurus, Fourth Edition. Newbury Park: Sage Publications, pp. 364-365.
81. In planning for data analysis, key deliverables can include thematic working papers, learning notes
and presentations, which should be identified in the Inception Report. These additional
deliverables help to illustrate the detailed data and analysis, and can be used as internal
documents, which should be filed for accountability purposes to enable future tracking as needed.
Data Sources
82. The key data sources that will be selected to answer each of the evaluation questions should be
presented. Common sources include project or programme beneficiaries, implementing partners,
project stakeholders, key informants, and project or programme documents, records, databases,
etc.
Sampling methods
83. The sampling methods to be used during the evaluation should be described in detail. This should
also include the area and population to be represented, the rationale for selection, the mechanics of
selection, sample size, sample precision and confidence, and limitations.
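Purely as an illustration (not part of the ITC guidelines themselves), the relationship between sample precision, confidence and sample size can be sketched with Cochran's formula and a finite population correction; the population figure and parameters below are hypothetical.

```python
import math

def sample_size(population, confidence_z=1.96, margin=0.05, p=0.5):
    """Cochran's formula with a finite population correction.

    confidence_z: z-score for the confidence level (1.96 ~ 95%)
    margin: desired precision (0.05 = +/- 5 percentage points)
    p: expected proportion (0.5 is the most conservative choice)
    """
    # Required sample size for an effectively infinite population
    n0 = (confidence_z ** 2) * p * (1 - p) / margin ** 2
    # Adjust downward for a finite population
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Hypothetical example: a beneficiary population of 500 SMEs,
# surveyed at 95% confidence with a +/- 5% margin of error.
print(sample_size(500))  # 218 respondents
```

Tightening the margin or raising the confidence level increases the required sample size, which is why these choices should be stated explicitly in the Inception Report alongside their resource implications.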
84. Should field visits be required during the data collection phase of the evaluation process, prior to
the end of the field mission, a debriefing session should be held with key national stakeholders to
present the preliminary findings and solicit early feedback. This discussion helps to facilitate
evidence-based consensus building with and among country stakeholders. It is important that
country stakeholders know that preliminary findings are not conclusive, as they are subject to
modification when additional information and feedback becomes available. A mission reporting
note, or a Note for the File (NFF), depending on the depth of analysis, could be prepared to
summarize the key findings and the plan for next steps. This is also important to align the evidence
collected during field missions with the objectives of the evaluation and the structure of the
Evaluation Report. Some form of mission reporting note is requested for each mission.
85. Building on the proposed evaluation questions set out in the TOR, the evaluation methodology
should be summarized in an evaluation planning matrix containing the following column headings:
evaluation criteria, evaluation issues, possible evaluation questions, data collection sources, and
data collection methods. An evaluation matrix is a tool that presents the evaluation questions
against evaluation methods that evaluators or evaluation teams have selected to collect factual
data paving the way to the analysis of the findings using triangulation. Against each evaluation
criterion, key questions are tailored to the objectives of the evaluation and the operational context.
Tables 3 and 4 below provide sample evaluation matrixes, as well as sample questions, data
sources, and data collection methods related to the evaluation criteria and cross-cutting dimensions
described above.
57 This is not an exhaustive list of data collection methods. Adapted from: International Fund for Agricultural Development (2015), op. cit.
Sample evaluation matrix (evaluation criteria, evaluation issues, possible evaluation questions, and data sources and methods)

Relevance

Evaluation issues: Conformity to ITC's mandate; relevance to strategic objectives; alignment to ITC's comparative advantages
Possible evaluation questions:
− Were the objectives and design of the intervention in line with ITC's corporate objectives, strategies, strengths and comparative advantages?
− Was the positioning of the intervention against competitors aligned to ITC objectives, strategies, strengths and comparative advantages?
Data sources and methods: Desk review of documentation, interviews at headquarters, etc.

Evaluation issues: Relevance to the needs of clients; participation of partners and beneficiaries in the intervention
Possible evaluation questions:
− To what extent does the intervention fit in with the policies and plans of the government and other development partners in the country?
− Were the objectives and design relevant to targeted partners' and beneficiaries' needs and priorities?
− Have partners and beneficiaries bought into and participated in the intervention?
Data sources and methods: Desk review, interviews in the field and with key national partners, survey, national statistics, national trade and export development strategies, etc.

Evaluation issues: Rationale, coherence, and adaptability of the intervention design and implementation
Possible evaluation questions:
− Did the intervention design benefit from available knowledge (e.g. recent evaluations, lessons from similar interventions)?
− Was there coherence in the intervention strategy to link causal effects between activities, outputs, and intermediate outcomes?
− Was the role of partners and beneficiaries in achieving intermediate outcomes and impact well defined?
− Were all the relevant pre-conditions necessary to achieve intermediate outcomes and impact identified and taken into account in design and inception?
− Was the design appropriately adapted to the changing context?
− Is the project grounded in an appropriate, well-articulated and valid project ToC? 58 Does the theory reflect recent research?
− Are the project-targeted stakeholders, design, operation and intended outcomes consistent with the project ToC?
Data sources and methods: Review of M&E reports and past evaluations, interviews, survey, national statistics, national development strategies, etc.

Evaluation issues: Strength and effects of internal and external partnerships
Possible evaluation questions:
− What kind of internal and external partnerships existed?
− What kind of coordination and support mechanisms were in place to support partnership and the achievement of common goals?
− How well did they work to support the achievement of intermediate outcomes and impact?
Data sources and methods: Review of documentation, interviews with cooperation partners, survey, etc.

Effectiveness

Evaluation issues: Effectiveness of intervention strategy to realize intermediate outcomes
Possible evaluation questions:
− Was the intervention effective in transforming outputs into support to partners and beneficiaries in terms of increased awareness and interest; increased knowledge, skills, and exchange; and improved consensus and feasible plans to act?
− Was the intervention effective in supporting partners and beneficiaries to achieve intermediate outcomes?
− If the activities are not yet completed, is it likely that so far unattained objectives may be accomplished in full or in part over time?
− What major changes in the overall context (e.g. policy framework, political situation, institutional set-up, economic shocks, civil unrest) have affected or are likely to affect project implementation and overall results?
− Are the project inputs and operations producing outcomes in the way the project ToC predicted?
− What changes in the project's design or implementation might produce better outcomes?
Data sources and methods: Review of M&E reports, interviews at headquarters and in the field, survey, national and local statistics, national and local social and economic development reports, etc.

Efficiency

Evaluation issues: Adequacy of staff and resources to achieve deliverables and objectives
Possible evaluation questions:
− What are the resources and costs to develop specific outputs?
− Do management team members have the necessary skills and expertise? Is the required capacity-building in place to mitigate gaps?
− Are financial resources available at the time they are needed? Are they sufficient to achieve the objectives?
− Were outputs and services delivered on time? Was the implementation period extended?
− Has the team obtained adequate cooperation from internal ITC partners?
− Has the team worked with competent subcontractors?
Data sources and methods: Review of financial and performance records, time records and outsourcing selection records.

58 Note that all theory of change based evaluation questions are adapted from: Stufflebeam, D. L. (2001). Evaluation Models. New Directions for Evaluation, No. 89, Jossey-Bass, San Francisco.
Quality and adequacy in − Were outputs produced at a reasonable cost and with acceptable quality? Review of documents, field
output production, − Was the administrative cost comparable to that of other development partners? interviews, interviews with
particularly against − What does the analysis of the M&E data tell us in terms of performance? cooperation partners, survey,
qualitative and quantitative − What other factors help account for project efficiency performance? national statistics, national
targets. development strategies, etc.
Quality and adequacy of − Did the M&E system provide for efficient project management and accountability?
monitoring and evaluation − Did it successfully enhance the evaluability of the intervention?
(M&E) system
Impact Partners’ and − To what extent have partners and beneficiaries changed their own situation and improved Review of M&E reports,
beneficiaries’ attainments their actions as a result of intervention? interviews, survey, national
of outcome and impact as − To what extent have partners and beneficiaries influenced other actors resulting in an statistics, national social and
a result or through the improved overall situation with the support of the intervention? economic development
support of ITC − Can any other positive or negative effects be observed as a consequence of the project results, etc.
intervention. interventions? What? Why?
Sustainability The extent to which − Do partners and beneficiaries have sufficient abilities and influence necessary to generate Review of documents, M&E
partners and beneficiaries impact? reports, interviews in the field
are enabled, committed − Was a specific exit strategy prepared and agreed upon by key partners to ensure post- and with key national and
and likely to contribute to intervention sustainability? local partners, survey,
ongoing benefits beyond − What is the likelihood that partners and beneficiaries will be in a position to continue national development
the intervention generating results and benefits after intervention completion? What factors are in favour strategies, etc.
of or against maintaining benefits?
27
ITC EVALUATION GUIDELINES
Evaluation
Evaluation issues Possible evaluation questions Data sources and methods
criteria
− Is there a clear indication that the government and/or other key partners have committed
financial and human resources to maintain benefits and results? Are the target groups and
their organizations prepared to continue the activities and benefits?
− What other factors account for the assessment of sustainability?
59
Note that all of the evaluation issues and possible evaluation questions provided for human rights and gender equality are extracted from United Nations Development Group (2014), Integrating
Human Rights and Gender Equality in Evaluations, pp. 77-79. Additional information for evaluation questions to assess design and planning, implementation and results can also be found on pages
81 to 85.
inclusion, non-discrimination, accountability).

Evaluation issue: The HR and GE dimensions of efficiency require a broader analysis of the benefits and related costs of integrating HR and GE in interventions.
Possible evaluation questions:
− Was there a provision of adequate resources for integrating HR and GE in the intervention as an investment in short-, medium- and long-term benefits?
− What are the costs of not providing resources for integrating HR and GE (e.g. enhanced benefits that could have been achieved through modest investment)?
− What is the extent to which the allocation and use of resources to targeted groups take into account the need to prioritize women and individuals/groups who are marginalized and/or discriminated against?
Data sources and methods: Review of documents, M&E reports, field interviews, survey, national statistics, national employment protection strategies, etc.

Evaluation issue: To assess the sustainability of results and impacts on HR and GE, the extent to which an intervention has advanced key factors that need to be in place for the long-term realization of HR and GE should be studied.
Possible evaluation questions:
− Has the project contributed towards developing an enabling or adaptable environment for real change on HR and GE?
− Is the institutional change conducive to systematically addressing HR and GE concerns?
− Has the capacity of targeted rights holders and duty bearers to respectively demand and fulfil rights been developed?
Data sources and methods: Review of documents, M&E reports, field interviews, survey, national statistics, national employment protection strategies, etc.

Evaluation issue: HR and GE results can be defined as the actual realization and enjoyment of HR and GE by rights holders. It is the real change (positive or negative, intended or unintended, primary or secondary) in HR & GE that is attributable to an intervention.
Possible evaluation questions:
− Have rights holders been able to enjoy their rights?
− Is there a real change in gender relations (e.g. access to and use of resources, decision-making power, division of labour)?
− Is there permanent and real attitudinal and behavioural change conducive to HR and GE?
− Is there a redistribution of resources, power, and workload between women and men?
− Are there effective accountability mechanisms operating for HR & GE?
Data sources and methods: Review of documents, M&E reports, field interviews, survey, national statistics, national employment protection strategies, etc.

Evaluation criterion: Innovation
Evaluation issue: The success of the intervention strategy and results to promote innovation, replication and scaling up, especially for small-scale projects.
Possible evaluation questions:
− What are the characteristics of innovation(s) promoted by the project that may benefit other AfT interventions? Are the actions in question truly innovative, or are they well established elsewhere but new to the country or project area?
− Were successfully promoted innovations documented and shared?
− Have the intervention approaches and innovations been replicated or scaled up by other partners in the same or other countries?
− What elements of the project ToC are essential for successful replication?
Data sources and methods: Review of documents, M&E reports, interviews in the field and with key national and local partners, survey, national development strategies, etc.

Evaluation criterion: Environment and climate change
Evaluation issue: The contribution to changes in the protection and rehabilitation of natural resources and the environment, and to climate adaptation and resilience
Possible evaluation questions:
− Has the intervention led to changes in the protection and rehabilitation of the environment and natural resources through trade support interventions?
− What activities have taken climate adaptation and resilience into consideration, and what are the results?
Data sources and methods: Review of documents, M&E reports, field interviews, survey, national statistics, national environmental development strategies, etc.
Workplan
86. The Inception Report should provide a timeline outlining the evaluation phases (data collection,
data analysis, and reporting), together with the key deliverables and milestones. Beyond the
timeline, specific responsibilities for each phase of the evaluation process should be identified,
covering all stages and who is responsible for what; for example, the quality review of the draft
report rests with the IEU.
Logistics
87. This section should provide a summary of the logistics required to carry out the evaluation. It
should include all elements required for the data collection phase, such as field visits,
transportation, communication requirements, etc.
Appendices
88. Appendices that should be included with the Inception Report include all relevant draft data
collection instruments (surveys, questionnaires, and interview guides), and the ToR. Any other
documents deemed relevant should also be included at this time.
Ethical behaviour
89. In line with the ITC Evaluation Policy, and the UNEG guidelines 60 and code of conduct for evaluation 61,
evaluators and evaluation teams should adhere to the principles of ethical behaviour throughout
the evaluation process, particularly during data collection. Evaluation team members should
ensure that they are familiar with and respectful of local beliefs, manners, and customs.
Interviewers must respect people's right to provide information in confidence and ensure that
sensitive data cannot be traced to its source. Attempts should be made to minimize demands on
interviewees' time. Evaluators and evaluation team members also have a responsibility to bring to
light issues and findings even when these do not relate directly to the evaluation ToR. At times, evaluators or
evaluation teams may uncover suspicious practices or evidence of wrongdoing; if this happens,
what, how and to whom these issues are reported should be discussed with the Evaluation
Manager, who must also inform ITC management.
60 United Nations Evaluation Group (2008). UNEG Ethical Guidelines for Evaluation, New York. Available from: http://www.unevaluation.org/document/detail/102
61 United Nations Evaluation Group (2008). UNEG Code of Conduct for Evaluation in the UN System, New York. Available from: http://www.unevaluation.org/document/detail/100
Box 7: Definitions for data, evidence, findings, conclusions, and lessons learned 62
Data: Any piece of qualitative or quantitative information that has been collected by the evaluation team is
called data. For example, Document X indicates that the vast majority (90%) of surveyed clients are
satisfied with the project services.
Evidence: A piece of information, or data, is qualified as evidence as soon as the evaluation team
assesses it as reliable enough. For example, Document X, quoting Ministry Y data that is considered
reliable, indicates that 85% of project beneficiaries or users of an output are satisfied with the service.
Findings: Findings establish a fact derived from analysis of the evidence. Findings do not include value
judgments. For example, the quality of service delivery has improved.
Conclusions: Conclusions point out the factors of success and failure of the evaluated intervention, with
special attention paid to the findings (i.e. intended and unintended results and impacts; any other strength
or weakness). A conclusion draws on data collection and analyses (undertaken through a transparent
chain of arguments), and on the evidence and findings from the evaluation process. For example, the project has
contributed to enhancing the exports of women entrepreneurs in the coffee sector.
Lessons Learned: Lessons learned are generalizations based on the evaluation process; they relate
logically to the evaluation findings and interpret the findings and conclusions of the evaluation in relation to
wider concerns. They are conclusions that can be transferred to the next cycle(s) of the same intervention
or to other similar interventions. For example, sustainability is rooted in the successful transfer of capacity,
skills and competencies to beneficiaries and partners in a way that fosters local ownership and long-term
commitment.
93. Findings should withstand criticism that they rest on weak evidence or subjective views, and therefore need
to be supported with convincing facts and reliable evidence (i.e. converging facts, records and/or
statements). In particular, data and views collected during interviews and observations should be
corroborated with reliable data from other sources before the data are used in the Evaluation
Report. If data gaps are identified, the evaluation team may need to collect additional information
to establish a robust analysis. In evaluation analysis, four levels of data strength can be
considered, as presented in Table 6 below.
Table 6: Levels of data strength 63

Observed fact: Factual evidence is the strongest. Observed facts can be in the form of visit reports, photographs, management records or any kind of traceable material.

Witness statement: Still considered very strong evidence; for example, beneficiaries in a training programme state that they have changed their attitudes after participating in the programme.

Proxy: This type of evidence is also called circumstantial evidence; for example, during the past few months several competitors of a subsidized firm collapsed, which indicates that the level of support was excessive and distorted competition. The strength of this type of evidence depends upon the strength of the logical reasoning supporting the inference.

Reported statement: An indirect statement is the weakest type of evidence; for example, programme managers state that beneficiary enterprises have strongly improved their competitiveness. The strength of this type of evidence depends upon the authoritativeness of the expert whose statement is used.
62 Adapted from: World Intellectual Property Organization (2009). Self-Evaluation Guidelines, Geneva. Available from: http://www.wipo.int/export/sites/www/about-wipo/en/oversight/iaod/evaluation/pdf/self_evaluation_guidelines.pdf; and the Organisation for Economic Co-operation and Development (2010), op. cit.
63 World Intellectual Property Organization (2009), op. cit.: 29.
Table 7: Quality checklist for evaluation terms of reference and Inception Reports 64
- Relevant aspects of the economic, trade, social and political context of the intervention
- Objectives of the project, ToC (intervention logic) and activities
- Management arrangements, challenges, and changes
- Progress and outputs
- Based on a comprehensive desk review, the Inception Report may need to further develop the above
contents included in the ToR.
- General methodological approach and design for data collection and analysis should be included in the
ToR.
- The ToR includes the evaluation criteria to be used in the evaluation (e.g. relevance, effectiveness, efficiency,
impact and sustainability, innovation and scaling up, women’s empowerment and gender equality, human
rights).
- Main evaluation questions related to each evaluation criterion should be included in the ToR.
- Based on a comprehensive desk review, the Inception Report finalizes the evaluation criteria and further
develops specific questions.
- The detailed data collection and analysis methods should be presented in the Inception Report.
64 Adapted from: United Nations Evaluation Group (2010). UNEG Quality Checklist for Evaluation Terms of Reference and Inception Reports, New York. Available from: http://www.unevaluation.org/document/detail/608
- Roles and responsibilities for the Project Manager and other evaluation team members
- Roles and responsibilities of the stakeholders and IEU in the evaluation process
- The ToR should also include general profiles of evaluation consultant(s) for recruitment or service
procurement.
- If major revisions are needed, the Inception Report will revise the above contents included in the ToR.
TENTATIVE ROADMAP
- Roadmap of the evaluation, including the anticipated date for each key deliverable
- If major revisions are needed, the Inception Report will provide a revised roadmap.
BUDGET ESTIMATE
- The ToR should include an estimated budget for consultancy and other evaluation-related services.
- If needed, the Inception Report will finalize the budget estimate in the ToR.
6. Evaluation report
95. Based on the approved Inception Report, the evaluator or evaluation team conducts the evaluation
and prepares a draft report. 65 It is the responsibility of the Evaluation Manager to ensure that the
draft Evaluation Report is prepared. Once all data have been analysed and evaluation results
compiled, the worth of an intervention can be assessed. Judging the achievements of a project
involves drawing out conclusions, lessons learned and recommendations. To formulate
conclusions, the evaluator or evaluation team applies the rating system to the evaluation criteria
agreed upon in the ToR and Inception Report. Data collection and analysis are structured
according to these criteria and key learning aspects. An indicative outline for an Evaluation Report
is found in Box 8 below.
Box 8: Indicative outline for an Evaluation Report

1. Introduction
Description of the project design, ToC, management arrangement, M&E system, and
major changes in design
4. Recommendations
5. Conclusions
7. Annexes:
Audit Trail; ToR; organizations and places visited, and persons met; data collection
instruments; any other relevant materials
Executive summary
96. The Evaluation Report starts with the executive summary, which concisely presents a brief overview of the
purpose, objective, scope and methods of the evaluation, the major findings, lessons learned, and
recommendations, and a summary of the conclusions stemming from the ToC analysis.
The credibility of evaluation findings is largely based on the rigour of its data collection
65 In principle, the final report does not include acknowledgements at the beginning, and no specific names are mentioned.
methods and the robustness of its analysis; a clear evidence trail is therefore considered obligatory to
present the required linkages.
97. An evidence trail is provided in a matrix and links findings, conclusions, and recommendations.
Each recommendation should be clearly anchored in the specific conclusions presented in the
report, and in turn, each conclusion is coherently generated from the specific analysis and evidence
presented in the same report (see Table 8 below). A clear evidence trail is considered obligatory
in presenting evaluation findings, conclusions, and recommendations. The conclusions should
provide clear answers to the questions asked in the Inception Report. As a good practice, the
conclusions should be organized in clusters to formulate coherent and inclusive recommendations.
The lessons are conclusions that can be transferred to the next cycle of the same intervention or
to other similar interventions in the near future. The report should also include a self-assessment
of the methodological limitations that may restrict the range or use of certain conclusions. The
evidence trail should appear at the end of the executive summary.
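The linkage described above can be sketched as a simple data structure. The IDs and texts below are hypothetical placeholders (not content from an actual ITC evaluation); the check simply verifies that every recommendation traces back, via conclusions, to findings.

```python
# Illustrative evidence trail: each conclusion cites the findings it rests on,
# and each recommendation cites the conclusions it follows from.
# All IDs and texts are hypothetical placeholders.
findings = {
    "F1": "Training outputs were delivered on time to all targeted TISIs.",
    "F2": "Surveyed SMEs report improved export readiness.",
}
conclusions = {
    "C1": ["F1", "F2"],  # conclusion C1 is supported by findings F1 and F2
}
recommendations = {
    "R1": ["C1"],        # recommendation R1 is anchored in conclusion C1
}

def trail_is_complete(findings, conclusions, recommendations):
    """True when every conclusion cites known findings and every
    recommendation cites known conclusions."""
    conclusions_ok = all(f in findings
                         for cited in conclusions.values() for f in cited)
    recommendations_ok = all(c in conclusions
                             for cited in recommendations.values() for c in cited)
    return conclusions_ok and recommendations_ok
```

A matrix laid out this way makes any orphaned recommendation (one citing a conclusion that does not exist in the report) immediately visible.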
99. The purpose, objective and scope of the evaluation, which may have already been set out in the
ToR, should be included here. Any new details that may have come to light during the evaluation
process regarding the project’s background and context should also be included at this time.
100. A discussion of the methodologies used in the evaluation and how they were applied should also
be included in this section. This should include an overview of the quantitative and qualitative
methods applied (including an explanation of the number of persons included per method, as
well as the criteria for selecting the project locations, etc.). Techniques used during the collection and
processing of data and information (e.g. data triangulation) should be mentioned as well. The
discussion of methodologies includes any information on how the data were collected,
the data sources (e.g. primary data from interviews, surveys, and questionnaires, and secondary
data), and the approach used to analyse the data. The Evaluation Report also mentions possible
restrictions encountered in using the methods (e.g. the non-availability of key informants), as well as their
possible effects on the evaluation, particularly its independence. This section is important as it
provides the basis for the credibility of the evaluation results.
102. At the project report stage, the use of the ToC emerges throughout the entire Evaluation Report.
Keeping the ToC in mind is vital for conducting the analysis leading to findings, the distillation of
conclusions and the issuance of useful recommendations. The causality required for assessing
that the intervention has successfully attained its goal is inferred from the following evidence:
• The intervention was based on a reasoned ToC; the results chain and the underlying
assumptions of why the intervention was expected to work were sound, plausible and
agreed to by key players.
• The ToC has been verified by evidence; the chain of expected results occurred, the
assumptions held, and the (final) outcomes were observed in terms of the value added by
ITC to its partners and beneficiaries.
• External factors (context) influencing the intervention were assessed. They were shown not to
have made a significant contribution, or, if they did, their relative contribution was
recognized and addressed to the extent feasible.
• In the end, a conclusion (a contribution claim) is made about whether the intervention has
made a difference in terms of the improvement facilitated by ITC of the situation and
abilities of its partners and beneficiaries.
104. However, although the use of evaluation criteria is necessary for a systematic and unified analysis
of the data collected during the evaluation process, the criteria can present a series of challenges when
evaluating complex interventions such as large programmes, a strategy, a policy or a corporate
approach. In these cases, the interpretation of the criteria may not always follow a linear logic; there
may be a limited ability to reflect complexity and synergetic effects, and gaps may become
evident in some key themes raised under Agenda 2030. 66
105. Evaluation adopts an integrated approach across the evaluation criteria, with a focus on the efficacy
of the strategy that was deployed to address the intervention ToC. It includes the examination of
the appropriateness of overall management arrangements and how these have affected the
intervention in terms of the value added by ITC to its partners and beneficiaries, and in terms of
the positive change made by the intervention through the improvement, facilitated by ITC, of the
situation and abilities of its partners and beneficiaries. Since expected changes are related to
sustainable development, the analysis expresses change in terms of ITC’s contribution to the
SDGs. The coordination, collaboration and support arrangements with partners and beneficiaries
and with other stakeholders are examined in the same depth.
106. In conducting the assessment of achievements throughout the intervention results chain, the
discussion begins with an examination at the output level. Regarding the implementation of
activities, 67 it addresses how this was undertaken, noting any constraints, and examining if and
66 DAC Network on Development Evaluation, Summary of the workshop on the OECD DAC Evaluation Criteria: Progressing the dialogue, Organisation for Economic Co-operation and Development (2018).
67 Activity is defined as ’Actions taken or work performed through which inputs, such as funds, technical assistance and other types of resources are mobilized to produce specific outputs’. Source: Organisation for Economic Co-operation and Development (2010), op. cit.: 15.
how monitoring and backstopping were carried out during implementation. This is done with a view
to drawing lessons from the experience.
107. The achievement of planned intermediary outcomes indicates the extent to which the planned
outputs 68 were delivered and how the intervention used the short-life cycle of outputs to attain the
corresponding results 69 within the long-life cycle of the results chain (see Figure 3 above). It also
includes how intermediary outcomes were achieved, or not, within the planned time-frame and the
resources available.
108. The examination further shows if and how the intermediate outcomes have been achieved,
or not, with a view to demonstrating whether the intervention strategy has been successful
in supporting partners and beneficiaries to improve their own situation and abilities through
the contribution of the intervention.
109. Following the results chain, where intermediary outcomes have been fully met the analysis focuses
on demonstrating how these have contributed to the attainment of the intervention outcome, 70
measured by some form of proven improvement in the international competitiveness of SMEs
directly or indirectly targeted by the intervention, as well as in terms of impact, 71 evidenced by
partners and beneficiaries having further built on the intervention support to improve other actors’
abilities and situation at a wider level within their own area of influence. As mentioned earlier, the
evaluation focuses on the measurement of impact expressed in terms of ITC’s contribution to the
SDGs. For intermediary outcomes that have not been attained, the report still shows what progress
has been made towards achieving them and how they have contributed to the attainment of the
overall goal of the intervention.
110. The aim of any intervention is to deliver lasting benefits, so the sustainability 72 of the intervention
is covered in the Evaluation Report. The discussion focuses on whether there is evidence that the
intervention's direct and indirect benefits will continue beyond the period of intervention assistance.
A key emphasis in the analysis is whether the intervention has strengthened the institutional and
human capacity needed to sustain the observed benefits.
Rating system
111. The harmonization of evaluation criteria and the rating system allows ITC to expand the evaluation
coverage of its operations, and to consolidate evaluation-based performance and results at the
corporate level. A six-point rating system has been designed to apply to ITC evaluations, as
described in Table 9 below. In evaluation reports, evaluators provide a qualitative justification for
the rating of each evaluation criterion, a quantitative measure based on the achievement of
planned targets set out in the logical framework, and a composite rating for overall project
performance based on consideration of the individual ratings. All ratings should be round
numbers, with no decimal points. Should the object of evaluation not have a logical framework (e.g.
a policy), only a qualitative assessment of each of the evaluation criteria, and an overall rating of
the project, is expected.
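As a minimal sketch of how individual criterion ratings could be combined into a composite, rounded to a whole number as required above: the rounded-average aggregation used here is an illustrative assumption, not the prescribed formula (the actual rating descriptions are set out in Table 9).

```python
# Hedged sketch: deriving a composite rating from individual criterion
# ratings on the six-point scale. The rounded average is an illustrative
# assumption, not ITC's prescribed aggregation rule (see Table 9).
def composite_rating(criterion_ratings):
    """Round the mean of the six-point criterion ratings to a whole number."""
    if not criterion_ratings:
        raise ValueError("at least one criterion rating is required")
    if not all(1 <= r <= 6 for r in criterion_ratings):
        raise ValueError("each rating must be on the six-point scale (1-6)")
    mean = sum(criterion_ratings) / len(criterion_ratings)
    return round(mean)  # ratings are reported as round numbers
```

For instance, ratings of 5, 4, 4, 3 and 5 average to 4.2 and yield a composite of 4 under this illustrative rule.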
68 Output is defined as ’The products and services which result from the completion of activities within a development intervention’. Source: United Nations Development Programme (2002). OECD/DAC Glossary of Key Terms in Evaluation Results-Based Management and Proposed Harmonized Terminology, New York. Available from: http://web.undp.org/execbrd/word/Final%20RBM%20terminology%2030%20May.doc
69 Results are defined as ’The output, outcome or impact (intended or unintended, positive and/or negative) of a development intervention’. Source: Organisation for Economic Co-operation and Development (2010), op. cit.: 33.
70 Outcomes are defined as ’The intended or achieved short-term and medium-term effects of an intervention’s outputs… Outcomes represent changes in development conditions that occur between the completion of outputs and the achievement of impact’. Source: United Nations Development Programme (2002), op. cit.
71 Impact is defined as ’…long-term effects produced by a development intervention, directly or indirectly, intended or unintended’. Source: Organisation for Economic Co-operation and Development (2010), op. cit.: 24.
72 Sustainability is defined as ’The continuation of benefits from a development intervention after major development assistance has been completed’. Source: Ibid.: 36.
112. When using the six-point rating system for an evaluation of an intervention with a logical framework,
a qualitative rating is given for each of the evaluation criteria, and a quantitative rating is given
based on the achievement rate of the planned targets set out in the logical framework. Both the
qualitative and quantitative scores are used to justify the rating.
113. The qualitative rating for each given evaluation criterion is based on considerations of different
elements of the criterion, as elaborated in the evaluation matrix. For example, the rating for
relevance should be based on a balanced consideration of the intervention relevance to ITC’s
strategic objectives, to the needs of clients (e.g. policymakers, TISIs, SMEs, other beneficiaries),
and to coherence and clarity of the design. Shortcomings in relevance may have to do with the
extent to which the project’s objectives, design, rationale, coherence, adaptability, or
implementation is inconsistent with partners’ and beneficiaries’ development needs and priorities,
and/or with ITC’s results framework. 74
114. The effectiveness rating may be based on an overall assessment of the achievement of each of
the project intermediate outcomes and impact. It is useful when discussing the effectiveness of
the project to provide a table based on the objectives found in the logical framework (i.e. impact,
outcome(s), outputs), and the associated indicators and targets, as well as a column discussing
the progress made on each component. This step is vital in conducting the quantitative
assessment to determine the quantitative rating (discussed below). Shortcomings in
effectiveness relate to the failure to achieve the project outcome, intermediary outcomes, and/or
73 Adapted from: Independent Evaluation Group, International Finance Corporation (2008). A Review of IEG’s Methodology for Assigning Development Outcome Ratings. Technical Note Number 3. Washington, page 2. Available from: https://wpqr4.adb.org/LotusQuickr/ecg/PageLibrary48257B910010370B.nsf/0/9213AC34699B70F148257B9D003D7AEB/$file/Assigning%20DO%20Ratings%20-%20Nov08.pdf, and United Nations Industrial Development Organization (2018). Evaluation Manual. Vienna, page 24. Available from: https://www.unido.org/resources/evaluation/evaluation-resources
74 Adapted from: Independent Evaluation Group, World Bank (2005). Harmonized Evaluation Criteria for ICR and OED Evaluations, Washington. Available from: http://ieg.worldbankgroup.org/sites/default/files/Data/HarmonizeEvalCriteria.pdf
outputs, and may have to do with the inability of the intervention strategy to support partners and
beneficiaries to improve their own conditions and actions as expected. 75
115. The efficiency rating may be based on an assessment of overall financial performance, output
quality, cost-effectiveness, and timeliness of outputs and outcomes. Shortcomings in efficiency
may have to do with the extent to which the intervention failed to achieve (or is not expected to
achieve) the project outcome, intermediary outcomes, and outputs with the available resources, or
is unable to account for results. 76
116. The impact rating may be based on the assessment of the extent to which the project has made,
or is likely to make, a difference to the beneficiaries, the extent of attributable change, and/or any
intended or unintended effects (whether positive or negative). Shortcomings in achieving
impact may have to do with the inability of the intervention strategy to contribute to the long-term
transformation of partners and beneficiaries and, at a wider level, to improvements in the conditions
and actions of others. 77
117. The sustainability rating may be based on whether the project results and benefits will be sustained
after the end of project funding, and/or the extent to which the outputs and results have been
institutionalized. Shortcomings in achieving sustainability may have to do with the inability of
the intervention strategy to enable partners and beneficiaries to take ownership of the results
achieved beyond the period of intervention. 78
118. The quantitative rating is a percentage calculated as the share of completed targets among the
total number of targets identified in the logical framework. Table 9 provides the formula applied to
transform the result into the rating. In all cases, the ratings determined for the qualitative and
quantitative criteria, and for the overall project, must conform to the descriptions provided in
Table 9, and supporting evidence should be provided.
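As a purely illustrative sketch of the calculation described above: the quantitative rate itself is a simple proportion. The function below is hypothetical; the bands that convert this percentage into a rating are defined in Table 9 and are not reproduced here.

```python
def quantitative_rate(completed_targets: int, total_targets: int) -> float:
    """Share of logical-framework targets completed, as a percentage.

    Illustrative only: the mapping from this percentage to a rating
    band is defined in Table 9 and is not reproduced here.
    """
    if total_targets <= 0:
        raise ValueError("the logical framework must define at least one target")
    return 100.0 * completed_targets / total_targets

# Example: 9 of the 12 targets in the logical framework were completed.
print(quantitative_rate(9, 12))  # 75.0
```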
119. It should be noted that the rating for overall performance and results (including qualitative and
quantitative) should not be an arithmetic average of the individual ratings; each criterion should be
given a weight appropriate for the project. In rating practice, it is necessary to check the
consistency among ratings on different criteria. For example, if a project was found to be
ineffective, it would be unusual for it to have a high rating for sustainability. 79
121. A high priority should be given to lessons learned. 80 This part of the report must deal with those
evaluation experiences and lessons that have broader applicability to other projects, programmes
or policies. Frequently, lessons highlight strengths and weaknesses in preparation, design, and
implementation that affect performance, outcomes and impact. Lessons should refer specifically
to the findings or the parts of the report they are based on, and should not be stated as
recommendations, observations or descriptions.
75
Ibid.
76
Ibid.
77
Ibid.
78
Ibid.
79
Asian Development Bank (2006). Guidelines for Preparing Performance Evaluation Reports for Public Sector Operations,
Manila, page 7. Available from: https://www.oecd.org/derec/adb/37965974.pdf
80
Lessons learned are defined as: ’Generalizations based on evaluation experiences with projects, programmes or policies that
abstract from the specific circumstances to broader situations’. Source: Organisation for Economic Co-operation and
Development (2010), op. cit.: 26.
122. The report should cover specific experiences that are considered good practices, which are drawn
from the evaluation and have a broader applicability to other activities of ITC. The Evaluation
Report should identify what worked well and how it can be replicated. Very often approaches to
project implementation that work in one situation are not made known to the rest of the
organization. This part of the report should identify these good practices so that they can be widely
shared within ITC.
123. The report should highlight major constraints and problems that have impacted the implementation
and delivery of the intervention. The aim here is to learn from these constraints and problems and
avoid them or find solutions to improve performance.
Conclusions
124. The report must draw conclusions based on all the above (findings, outcomes, lessons learned,
recommendations, etc.). Evaluation conclusions point out the factors of success or failure in a
project, with special attention paid to the intended and unintended results and impact, as well as
other strengths or weaknesses. A conclusion draws on data collection and analyses undertaken
through a transparent chain of arguments. 81 There must be a clear link between conclusions,
findings, and recommendations.
Recommendations
125. This part of the report should provide clear and pragmatic recommendations aimed at enhancing
the quality of interventions. Recommendations are proposals aimed at enhancing the
effectiveness, quality or efficiency of a development intervention, and at redesigning the objectives
and/or the allocation of resources. 82 They must derive directly from one or more specific
conclusions, and should aim at improving or reforming the project or preparing the design of a
new intervention. Recommendations should be clustered around strategic considerations, should
be operational and feasible, and the conditions of implementation should be specified.
Recommendations should: 83
• Be sensitive in the choice of words (e.g. use words like should or must to express
advisability or necessity).
• Be firmly based on evidence and analysis (not be opinion-based), and logically follow from
the evaluation findings and conclusions.
• Be formulated with clear priority actions and their use in mind, reflecting an understanding
of ITC’s organizational context and potential constraints to follow-up.
• Be clear on who needs to implement them (both for priority actions and oversight
responsibility), and clearly identify the target group for each recommendation.
• Be action oriented (human, financial and technical resource implications outlined), without
being overly prescriptive.
• Leave room for fine-tuning the implementation approach, while remaining balanced and
impartial.
• Be relevant to the purposes of the evaluation and, once drafted, be presented to relevant
stakeholders for further refinements, as appropriate.
81
Organisation for Economic Co-operation and Development (2010), op. cit.: 18-19
82
Ibid.: 32.
83
Adapted from: United Nations Evaluation Group (2017). Improved Quality of Evaluation Recommendations Checklist, New
York. Available from: www.unevaluation.org/document/download/2680
127. Many issues come up during an evaluation, and some of these are resolved by Management while
the evaluation is under way. This part of the report should provide a short summary of all
such issues, as it demonstrates how evaluation results are already being taken on board. These
issues should not be included in the section regarding actions and decisions recommended.
128. The Evaluation Report should clearly show the major proposals and suggestions that are made,
which aim to improve programme and project delivery, management or policy change. Some of
the recommendations may urge management to make certain decisions or take certain actions.
For each recommendation, a person or entity is identified as responsible for its implementation.
129. During the process of formulating recommendations, project management (and other likely
implementers) should be given the opportunity to review draft recommendations and provide
comments or suggestions for refinement ― provided this step does not compromise the required
accountability.
• the credibility of the data and analysis used in the Evaluation Report, and justification with
a minimal level of bias;
REPORT STRUCTURE
The report follows the ITC Evaluation Guidelines, is well structured, logical, clear and complete, and an
executive summary is prepared summarizing the key findings and recommendations.
Evaluation objectives and scope are fully explained, and a summary of the methodology provided:
- An explanation of the chosen evaluation criteria, performance standards or other criteria used by the
evaluators
- The methods followed by the evaluation are in line with ITC’s Evaluation Policy and Guidelines; the data
used in the report are reliable, balanced and comprehensive, and the analyses to form judgments are
credible and objective.
- A precise description of the methodology applied that clearly explains how the evaluation was
specifically designed to address the evaluation criteria, yield answers to the evaluation questions and
achieve the evaluation purposes
- The report presents evidence that adequate measures were taken to ensure data quality, including
evidence supporting the reliability and validity of data collection tools (e.g. interview protocols,
observation tools)
- Gaps and limitations in the data and/or unanticipated findings are reported and discussed
The report presents a clear and full description of the 'subject' of the evaluation, including:
- The context of key economic, social, political, demographic and institutional factors
- ToC / the expected results chain (inputs, outputs, and outcomes), and scale and complexity of the
project or programme
- Key stakeholders involved
- Implementation status of the project or programme, including significant changes (e.g. plans, strategies,
logical framework) that have occurred over time, and the implications of those changes for the
evaluation
Based on analysis of management performance and changes, if applicable, the performance of ITC and
main partners should be summarized.
- Conclusions present reasonable judgments based on findings, are substantiated by evidence, and
provide pertinent insights
- Each conclusion refers to the specific analysis presented in the Evaluation Report
84
Note: The checklist can be used for both draft and final Evaluation Reports. Adapted from: United Nations Evaluation Group
(2010). UNEG Quality Checklist for Evaluation Reports, New York. Available from: http://www.uneval.org/document/detail/607
- Conclusions lay the firm ground for making evaluation recommendations pertinent to the prospective
decisions and actions of evaluation users
- Lessons and good practices are presented based on analysis
RECOMMENDATIONS
- Recommendations are forward-looking, relevant to the subject, actionable and reflect an understanding
of strengths and constraints of ITC
- Recommendations are supported by evidence and conclusions, and each recommendation refers to
specific conclusion points in the report
- The target group for each recommendation is clearly identified
131. Once the draft report is quality assured by the IEU, it should be sent to the respective Divisions and
delivery managers for comments, and the comments should be returned to the IEU within two
working weeks. The draft will then be shared with ITC management for review and comments.
132. When ITC internal comments are addressed, the draft report may be shared with external
stakeholders ― including representatives of funders and client countries, and implementation
partners in project countries ― for review and comments. During the commenting period,
dedicated meetings may be organized with key stakeholders to clarify remaining concerns and
promote common understanding and evidence-based consensus building among partners. This
process can be an opportunity to prepare key partners to agree on and follow up on evaluation
findings and recommendations.
Independent Evaluations
133. The draft Evaluation Report should be circulated among all stakeholders for comments. At times,
it may be prudent to share the draft Evaluation Report with the Project Manager to offer a
preliminary opportunity to determine if there are any factual errors, omissions, or inconsistencies.
If possible, it could also be shared with other ITC managers and staff for peer review. The
commenting period ranges from two to four weeks depending on the scale of the evaluation.
During the commenting process, dedicated meetings may be organized with stakeholders to clarify
the findings and recommendations, discuss remaining concerns, and promote common
understanding and consensus among partners. Together, these meetings represent an opportunity
to further secure partners' agreement on evaluation findings and follow-up on lessons, good
practices and/or recommendations.
134. Audit Trails will be prepared by the IEU to indicate how the evaluation team has addressed the
comments of key stakeholders in the final Evaluation Report. An Audit Trail is based on a simple
matrix where stakeholders can identify the area in the draft report requiring clarification or factual
changes, and a note containing the specific issue being addressed (a sample Audit Trail template
can be found in Annex IV). When all comments have been collated, the evaluator(s) address each
issue and provide a comment to each. The Audit Trail should be filed for considerations of
accountability and reliability and be published as an annex to the final report for transparency
purposes. An Audit Trail serves several purposes:
• It demonstrates how the Evaluation Report has treated (i.e. either accepted or declined)
the comments obtained from various stakeholders in the draft report stage.
• Good quality Audit Trails support evidence-based consensus building among partners and
prepare key partners for implementing evaluation recommendations.
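The matrix described above can be pictured with a minimal sketch. The field names and sample entries below are illustrative assumptions, not the Annex IV template:

```python
# Hypothetical sketch of an Audit Trail matrix: each row pairs a
# stakeholder comment on the draft report with the evaluation
# team's response. Field names and entries are illustrative only.
audit_trail = [
    {
        "reference": "Section 3, para. 2",   # area in the draft report
        "stakeholder_comment": "The partner list omits two TISIs.",
        "evaluator_response": "Accepted: list completed in the final report.",
    },
    {
        "reference": "Annex I",
        "stakeholder_comment": "Request to reweight the efficiency rating.",
        "evaluator_response": "Declined: the rating is supported by the evidence.",
    },
]

def unanswered(rows):
    """Return rows still awaiting an evaluator response."""
    return [r for r in rows if not r.get("evaluator_response")]

# Every collated comment must receive a response before finalization.
assert unanswered(audit_trail) == []
```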
Self-Evaluations
135. Review of the Self-Evaluation is conducted by IEU; it is a formal recognition of the quality of the
evaluation and makes it eligible for disclosure and communication. The review is conducted in line
with criteria described in the Quality checklist for evaluation ToR and Inception Reports (Table 7),
and the Quality checklist for evaluation reports (Table 10).
136. To uphold the quality standard, the evaluation deliverables should be reviewed and commented
on by the stakeholders and IEU. Concerning Self-Evaluation, the evaluation manager is
responsible for quality checks and soliciting comments and reviews on the draft deliverables. To
facilitate the former, it is recommended to use the two quality checklists (found in Tables 7 and 10,
as mentioned above). The quality checklist in Table 10 should be used to review the draft
Evaluation Report before sharing the Self-Evaluation with stakeholders for their feedback and
comments. Quality assurance aims to enhance professional standards, credibility, and utility. The
quality review should be relatively simple, brief and quick.
137. The IEU quality assurance process is an independent and critical review of the evidence, results,
and assessments of a Self-Evaluation, and in line with the quality requirements criteria (see Table
10 above). After review, the Self-Evaluations qualify for circulation among stakeholders and for
publication (including on the ITC evaluation website). IEU is not able to support the disclosure and
publication of Self-Evaluations that do not meet quality requirements.
138. The process is largely based on a review of the Self-Evaluation Report and, if necessary, cross-
checking among stakeholders. IEU conducts a quality review of the ToR and final report. The
review generally focuses on assessing (i) internal validity of the Self-Evaluation Report (the quality
of data, analysis, findings, and recommendations); and (ii) consistency among ITC’s evaluations
(appropriateness of evaluation methodology, ratings, and conclusions in comparison to other ITC
evaluations).
139. Once a Self-Evaluation is satisfactorily reviewed, the report is endorsed for use and disclosure
among stakeholders. The learning potential is amplified when Self-Evaluations are included in the
organizational evaluation learning loop. In other words, the lessons learned and good practices
reported by the Self-Evaluation should be shared across Divisions.
140. After review, endorsed Self-Evaluations qualify for disclosure among stakeholders and for
publishing (such as on the ITC evaluation website). For presenting corporate-level performance
and results in the AESR, the results generated from Self-Evaluations are included (along with those
generated by Independent Evaluations and Funder-led Evaluations). IEU takes responsibility for
cross-checking the analysis and findings generated by Self-Evaluations and presenting more
inclusive corporate-level results and impact.
141. Once the comments from stakeholders and peers are addressed, an Audit Trail should be prepared
by the Self-Evaluation Manager to indicate how the comments have been treated in the final report.
The Audit Trail note should be filed for accountability and reliability, and it is good practice to
include it as an annex to the Self-Evaluation Report. Further information on Audit Trails is provided
above, and a template is provided in Annex IV.
Evaluation Use
Independent Evaluations
142. As mentioned above, a communication plan is an integral part of the evaluation ToR and Inception
Report. The plan may be updated or modified during the later stage of the evaluation. Evaluation
communication activities should be conducted on a timely basis throughout the evaluation process,
and key stakeholders should be regularly involved and updated; this will help familiarize
stakeholders with evaluation concerns and prepare them to use the evaluation findings in later
stages. Communication may take various forms: for example, formal or informal, written or oral,
face-to-face discussion, telephone conference, or videoconference.
143. Upon the completion of the Evaluation Report, dissemination should be done in a timely manner.
The IEU prepares various communication products, such as an evaluation communication note;
the dissemination of evaluation products is tailored to the client’s preferences. Evaluation reports
are published online on the ITC website.
Self-Evaluations
144. The Self-Evaluation Manager identifies the potential users at the outset of the evaluation and
prepares a learning plan to share the findings with staff and stakeholders. The learning plan should
prioritize consensus building and use of the evaluation results in new interventions and for other
projects. If the objective and issue(s) of interest in the Self-Evaluation are mainly for performance
improvement, the range of potential users could be narrow.
145. Communication activities should be arranged and conducted on a timely basis throughout the
course of the Self-Evaluation process, and key stakeholders should be regularly involved and
updated. Involvement of stakeholders helps the evaluation capture their expectations and
perspectives and prepares them for using the evaluation findings and recommendations in next
steps.
146. For presenting corporate-level performance and results in the AESR, the IEU incorporates the
findings and lessons learned generated through Self-Evaluations. IEU also maintains a repository
of the Self-Evaluations conducted.
Management response
147. Management Responses are usually only required for Independent Evaluations. In some
circumstances, however, a Management Response may be requested for a Self-Evaluation. The
below instructions pertain to Independent Evaluations.
148. Preparing a Management Response is a critical step for integrating evaluation-based learning into
new interventions and actions. Once the Evaluation Report is finalized, it is the responsibility of the
people and functions to which the recommendations are addressed: to prepare a Management
Response in consultation with IEU; and to submit a draft response to IEU for follow-up purposes.
In cases where evaluation recommendations are addressed to several people or entities, a focal
point should be nominated by management to coordinate the process. To help the responsible
managers prepare their response, IEU should discuss with them in advance, and address the
relevant comments, concerns, and perspectives of the delivery managers and stakeholders, before
finalizing the evaluation recommendations in the report.
149. The Management Response comprises a list of the recommendations and indicates whether each
has been accepted, partially accepted or rejected (a template is provided in Annex V). If a
recommendation has been accepted or partially accepted, a list of actions designed to support its
implementation is required, along with a timeline for completion and the person(s) or entity(ies)
responsible for carrying out the actions. When creating the actions, it is important to bear in mind
that they should lead to the successful completion of the recommendation. When
recommendations are partially accepted or rejected, a justification is expected to support the
decision. The Management Response is
quality checked by IEU and approved by senior management; the final version should be signed
off by the responsible officers. The final Evaluation Report, mainly the findings and
recommendations, together with the Management Response, may be presented to the SMC.
150. IEU collects the implementation status of all evaluation recommendations every six months (March
and September of each year), consolidates this information, and reports significant issues to the
SMC. The implementation status should be updated periodically by the responsible operational
teams. IEU maintains a tracking system for the recommendations endorsed by the SMC, follows
up on the status of accepted recommendations, and reports consolidated progress to the SMC
periodically. The consolidated implementation status is included in the AESR.
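The consolidation step can be pictured with a minimal sketch; the status labels and record fields below are assumptions, not the tracking system's actual schema:

```python
from collections import Counter

# Hypothetical recommendation-tracking records; status labels are assumed.
tracked = [
    {"recommendation": "R1", "status": "completed"},
    {"recommendation": "R2", "status": "in progress"},
    {"recommendation": "R3", "status": "not started"},
    {"recommendation": "R4", "status": "completed"},
]

def consolidate(records):
    """Tally implementation status across recommendations for reporting."""
    return dict(Counter(r["status"] for r in records))

print(consolidate(tracked))
# {'completed': 2, 'in progress': 1, 'not started': 1}
```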
Evaluation follow-up
Independent Evaluations
151. Independent Evaluations are used and followed up through an interactive process involving all
stakeholders. Based on the evaluation Management Response, management should integrate
evaluation results and recommendations into its policies and programmes. As mentioned in the
previous section, IEU systematically follows up on the implementation of evaluation
recommendations and presents periodic reports on their status. When analysing Monitoring Plans
and Evaluation Plans at the project design phase, IEU should check whether similar projects have
been evaluated and whether their findings were considered in the project design.
152. Once the Management Response and Action Plan for all accepted recommendations have been
completed, this plan constitutes the baseline against which future progress is measured. The logic
of the Action Plan is that actions taken in response to recommendations should contribute to the
expected results and to the complete implementation of each recommendation by a given date.
The purpose of follow-up reporting is to promote organizational learning and accountability for
results. The manager responsible for the Action Plan must submit periodic status reports to IEU.
153. Management periodically assesses the extent to which the accepted or partially accepted
recommendations have been implemented, and this status is reviewed by the IEU. The review
examines how evaluation follow-up has improved programme design, delivery, and strategic policy
development. The key findings and lessons learned are included in the AESR, which is presented
to the JAG.
Self-Evaluations
154. The Self-Evaluation Manager prepares a follow-up plan on the use of the findings and/or on the
implementation of the recommendations, and progress should be updated periodically. It is the
responsibility of the Project Manager to follow up on the implementation of the Self-Evaluation
Action Plan and recommendations.
156. For the formal closure of project operations, a PCR is prepared by the responsible project team at
the end of implementation. The PCR is intended to encourage the project team to reflect on and
learn from project performance from an evaluative perspective. The PCR is a means to ensure
that project results are reported in line with the approved plans and that projects are closed in
compliance with ITC rules and procedures. 85
157. A PCR is often considered a form of Self-Evaluation, as it also addresses learning. ITC evaluation
criteria (as discussed above), and data collection methods (discussed above and listed in Table 3)
may be applied accordingly. For accountability, a PCR should describe the project design,
intervention logic, changes during the implementation period, planned and actual costs, planned
and actual outputs, planned and actual results (as objectively as possible), lessons learned and
any other relevant issues.
85
Note: As of May 2018, the PCR process is now integrated into the project cycle process in the projects portal.
i. The ITC format, when the funders acknowledge the quality of ITC’s M&E system and
accept the PCR as equivalent to their own closure report; or,
ii. The funder’s format, as agreed by ITC; Project Managers may need to provide additional
information requested for ITC reporting purposes using the PCR template.
159. A draft PCR should be reviewed within the Division for quality enhancement; once it is cleared by
its respective Director, the responsible manager should transmit the approved version to IEU. IEU
does not formally validate the quality of PCRs, though it can provide advisory support to managers
to enhance quality.
160. Project Managers may opt to convey additional comments beyond those that are required for
completing the PCR. Should this be the case, these comments are strictly confidential, outside of
evaluation scope and for the use of the SPPG Chief only.
LEVEL OF RISK

CRITERIA | High (3 points) | Medium (2 points) | Low (1 point)
Delivery complexity | Project delivered by more than two ITC Sections | Project delivered by two ITC Sections | Project delivered by one ITC Section
ITC Trust Fund (W1) | Bonus +3 points | No bonus points | No bonus points
Strategic partnerships | Intervention developed important strategic partnerships | Intervention developed strategic partnerships | No strategic partnerships
Quality of Evaluation Plan | Add 3 bonus points | No bonus points | No bonus points
Evaluation undertaken by funder | Not applicable | n/a | n/a
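Read as a simple additive score, the rubric above can be sketched as follows. This is a hypothetical illustration: the mapping from the total points to a decision (e.g. whether an Independent Evaluation is required) is not defined in this table, and the function name and parameters are assumptions.

```python
# Hypothetical sketch of the risk-scoring rubric above. Point values
# follow the table; the "Evaluation undertaken by funder" row is
# marked not applicable and is therefore not scored here.
CRITERIA_POINTS = {"high": 3, "medium": 2, "low": 1}

def risk_score(delivery_complexity, strategic_partnerships,
               itc_trust_fund_w1=False, quality_evaluation_plan=False):
    """Sum rubric points for one project.

    delivery_complexity / strategic_partnerships: "high", "medium" or "low".
    itc_trust_fund_w1 and quality_evaluation_plan each add 3 bonus points.
    """
    score = CRITERIA_POINTS[delivery_complexity]
    score += CRITERIA_POINTS[strategic_partnerships]
    if itc_trust_fund_w1:
        score += 3  # ITC Trust Fund (W1) bonus
    if quality_evaluation_plan:
        score += 3  # Quality of Evaluation Plan bonus
    return score

# Project delivered by three Sections (high), with important strategic
# partnerships (high), funded from the ITC Trust Fund (W1):
print(risk_score("high", "high", itc_trust_fund_w1=True))  # 9
```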
Access to informants
Description: The sampling process proves to be difficult.
How to cope: Decide whether a reduced sample size is likely to provide statistically valid findings. If not, apply another technique such as a focus group discussion.

Lack or weakness of data
Description: An information source proves to be incomplete.
How to cope: If possible, extrapolate missing data and cross-check with other sources.
Annex III: How to cope with some common data collection biases 87
Confirmation bias
Description: This risk is a threat to all data collection approaches. It results from a tendency to seek out evidence that is consistent with the intervention logic, rather than evidence that could disprove it.
How to cope: When subject to this bias, the evaluation team and informants tend to focus on intended effects and systematically overlook external factors, unintended effects, negative effects, interactions with other policies, outside stakeholders, alternative implementation options, etc. In an Independent Evaluation process, this bias is avoided by relying on independent and professional evaluators.

Informants' strategy
Description: Those who have a stake in the intervention may distort the information they provide, with the aim of influencing the evaluation's conclusions.
How to cope: This bias will be reduced if a whole range of stakeholders is included in the data collection workplan and if various sources of information are cross-checked.
86
Source: World Intellectual Property Organization (2009), op. cit.: 25.
87
Ibid.: 26.
Unrepresentative sample
Description: This bias may be a matter of concern if the evaluation team generates quantitative data through a questionnaire survey. It should also be considered when using secondary data obtained from a questionnaire survey.
How to cope: In this instance, the evaluation team should verify that the sample of surveyed informants is large enough and representative of the population.

Question-induced answers
Description: This bias is frequent in interviews and questionnaires. The way in which questions are asked by interviewers, or the interviewer's reaction to answers, can generate a bias which is either positive or negative. Even the order of the questions in a questionnaire may change the substance of the answers.
How to cope: This bias will be limited by having questionnaires designed and tested by experienced professionals. Systematically mix positive and negative questions to reduce empathy bias and question bias.

Empathy bias
Description: Interviewees may not have a pre-determined opinion about the questions put to them. They try to make up their minds in a few seconds when responding to the interviewer or to the questionnaire. While doing so, they may be strongly influenced by the context. Especially in the case of interviews, the evaluation team must create a friendly (empathetic) atmosphere, at least for the sake of achieving a high rate of answers and fast completion of the survey. The combination of the two introduces a systematic positive bias in the answers, which tends to overestimate the benefits of the intervention and to underestimate the role of external factors.
How to cope: This bias is prevented by relying on properly trained interviewers and evaluators. Systematically mix positive and negative questions to reduce empathy bias and question bias.

Sample selection bias
Description: People who agree to be interviewed may not be representative of the overall target audience.
How to cope: This bias can be controlled by undertaking a special qualitative survey on a few “non-respondents”, although this exercise brings additional costs.
Name: 88
Title:
Email address:
This feedback form has been designed to capture your questions or comments in a structured, focused
and readily usable way. It will enable you to direct your questions or comments to specific paragraphs
of the draft Inception Report.
Identifier (please insert relevant paragraph number, page number, annex number, or other reference identifier) | Question / Comment
Audit Trail 89
88
When comments and feedback are collated and incorporated as an annex to the Evaluation Report, all identification data and
contact information is removed.
89
It is important that each of the comments listed in the feedback form are addressed in the audit trail.
Evaluation Title:
Date:
Signature:
Responsible manager:
Date:
Signature:
Responsible Chief/Director:
Date:
Signature:
Partner:
Recommendation 1:
Assigned to:
The recommendation is: Please explain why the recommendation is accepted, partially accepted or rejected.
Accepted:
Partially accepted:
Rejected:
Funders:
Signature / Date:
Signature / Date:
Signature / Date:
90
The PCR template is available on the ITC intranet under Project Management Guidelines, available from: https://our-intranet.itc-cci.net/oed/sppg/ProjectManagement/SitePages/Project%20completion%20report.aspx
Important Note:
• Please liaise directly with the relevant DPS service(s) for financial and administrative reporting
requirements for project closure.
Project logframe: Attach the project logframe from the project portal, as an annex to the PCR, including
the final narrative report in the portal (NPP-Results Tab – Results Monitoring Section).
Other achievements: Provide additional comments in case the project had results (positive changes
for beneficiaries, contributions to the SDGs) that were not planned in the logframe and therefore not
reported in the portal.
1.1. SELF-ASSESSMENT
In case a PCR has been submitted to the funder and uploaded into the project portal, did this report
include a self-assessment section in which you explicitly commented on the following evaluation
criteria: 91 relevance, effectiveness, efficiency, impact and sustainability, lessons learned, conclusions
and recommendations? (Yes / No)
If yes, simply do the rating for the categories below and specify in the text box on which pages you have
addressed the evaluation criterion in the PCR to the funder.
If answered no or not applicable, please complete all the sections below with your project performance
assessment. This assessment requires a narrative and a self-rating for each OECD/DAC evaluation
criterion.
Relevance: The extent to which the project was aligned with ITC’s results framework and suited to the
priorities and policies of beneficiaries and partners.
Narrative (insert text): Assess the consistency of the objectives of the project with ITC corporate
goals, the beneficiaries’ and partners’ needs, and the country’s development strategy and/or policy
priorities.
Provide your rating of the relevance of the project: level of satisfaction on the extent to which the
activities and outputs of the project were consistent with achieving the intermediate outcomes and
contributing to impact (only choose one in the scale below).
91 Organisation for Economic Co-operation and Development (2015), op. cit.
Effectiveness: A measure of the extent to which the project has attained its objectives.
Narrative (insert text): Assess the extent to which objectives (outcomes and intermediate outcomes)
were achieved or are expected to be achieved. What were the major factors influencing the
achievement or non-achievement of these objectives?
Rate your level of satisfaction on the extent to which the objectives were achieved (only choose one in
the scale below).
Efficiency: A measure of the outputs (qualitative and quantitative) in relation to the inputs.
Narrative (insert text): Ascertain to what extent the project has converted its resources and inputs
(funds, expertise, time, etc.) economically into results.
Rate your level of satisfaction on the cost-efficiency of the activities (only choose one in the scale
below).
Impact: A measure of the positive and negative changes produced by the project, directly or indirectly,
intended or unintended.
Narrative (insert text): Describe long-term economic, social and environmental impacts and
contribution to the selected SDG targets produced by the project, particularly in terms of the partners’
and beneficiaries’ situation and actions.
Rate your level of satisfaction as to the real positive change that the project has made (only choose
one in the scale below).
Sustainability: An assessment of whether the benefits of the project are likely to continue after the end
of the project.
Narrative (insert text): Indicate the likelihood of continued long-term benefits of the project. What were
the major factors that influenced the achievement or non-achievement of project sustainability?
Rate your level of satisfaction on the extent to which the benefits of the project will likely continue after
funding has ceased (only choose one in the scale below).
Lessons Learned: (those that can be applied to the next phase of the same intervention or to other
similar ongoing and future interventions)
Insert text: Lessons learned include what worked well and what did not. Please bear in mind that
lessons learned may have a broader application to other projects, programmes or policies.
Recommendations: (those that are specifically assigned to a responsible manager and team who
should implement them to enhance future performance)
Insert text: Recommendations should be based on lessons learned. They should be clear and
pragmatic, aimed at enhancing the quality of project design, the effectiveness and efficiency of
projects, and sustainability of results.
2. PROJECT FOLLOW-UP
List outstanding issues related to project closure, post-project handover and/or required follow-up.
Indicate the focal point at ITC and in the project country for future inquiries about the project.
Risk Management
Identify any risks that could affect future ITC projects or operations (e.g. conflict with a local
partner/individual, negative reaction to ITC due to a project incident or unmet expectations), and
persons and teams who will take on responsibility for monitoring them.
Records Management
Identify what arrangements have been put in place for the storage, security, and backup of project
documents (including consultant reports, minutes of meetings, monitoring records, and important
correspondence with funders, partners, and beneficiaries).
Administrative closure
Please review your grant Memorandum of Understanding and contact the Division of Programme
Support (DPS) for requirements related to the financial closure of the project (e.g. closure of petty
cash accounts, transfer of assets, closure of field office).
3. ADDITIONAL COMMENTS
If a Project Manager has information that is important but should not appear in the public PCR,
please send these comments directly to the Chief, SPPG, by e-mail. The information will be used
for learning purposes, and any communication about these matters will keep the source
anonymous unless agreed otherwise.
4. ANNEXES
Printed by ITC Digital Printing Service.
Street address: International Trade Centre, 54-56 Rue de Montbrillant, 1202 Geneva, Switzerland
Postal address: International Trade Centre, Palais des Nations, 1211 Geneva 10, Switzerland
P: +41 22 730 0111 | F: +41 22 733 4439 | E: itcreg@intracen.org | www.intracen.org
The International Trade Centre (ITC) is the joint agency of the World Trade Organization and the United Nations.