Royal Danish Ministry of Foreign Affairs - Publication

Annex A Abbreviated Terms of Reference

1. Background

In order to assess Danida’s support to development research over recent years, as well as to provide recommendations which will feed into the current process of formulating an overall strategy for this support, Danida’s evaluation department has decided to commission an external evaluation. So as to allow the evaluation to pursue an adequate level of analytical depth, it has been decided that the thematic scope should be limited to support for research within agriculture and natural resource management. This thematic area has received substantial support and represents an area of strong interest to Danida. The period covered will be 2006 to 2011.

Danida has supported development research within a range of subjects for several decades. Over recent years, annual funds have ranged between DKK 200 and 285 million, which have been disbursed through various channels, including the Consultative Research Committee for Development Research (FFU), research networks, and centre contracts with KU-LIFE.

The research support has not previously relied on a formal strategic framework (one is currently under preparation), but has been guided by recommendations provided in the so-called Hernes report of 2001, in which an international panel examined Danish development-oriented research. One of the key recommendations was to link support to research more closely to the priorities of Danish development cooperation. As a consequence, over the last decade, efforts have been made to ensure that support to development research takes shape in line with the International Development Cooperation Act (Lov om Internationalt Udviklingssamarbejde), development policy priorities and other Danida strategies, and that it conforms to the Danish research grant system in general. In 2012, an overall objective for Danish support to development research was formulated for the first time in the new International Development Cooperation Act (§7), declaring that it should strengthen research capacity in partner countries and create new knowledge capable of alleviating development problems.

Support to development research is distributed primarily through account §06.35 of the Danish Finance Act. Sub-accounts correspond to “research and information activities in Denmark” (§06.35.01) and “international development research” (§06.35.02), and these are further subdivided into more specific channels, each with a specific purpose. An overview of the specific channels, as presented in the Finance Act of 2012, is given in Figure 1 below; the channels to be covered by the present evaluation are indicated in the figure.

Figure 1 Overview of channels of support for development research
(four-digit account, with six- and eight-digit sub-accounts indented)

§06.35 Research and information activities
  §06.35.01 Research and information activities in Denmark
    §06.35.01.10 Projects in Denmark *
    §06.35.01.11 Research activities *
    §06.35.01.13 Information activities
    §06.35.01.14 Intercultural cooperation
    §06.35.01.15 Fact-finding activities (minor studies) *
    §06.35.01.17 Seminars, courses, conferences etc.
    §06.35.01.18 Evaluation
  §06.35.02 International development research
    §06.35.02.10 The Consultative Group on International Agricultural Research (CGIAR)
    §06.35.02.11 Other international development research

Note: Channels to be covered by the evaluation are marked with an asterisk (*).

Of the channels presented in Figure 1, §06.35.01.10, §06.35.01.11, §06.35.01.15, §06.35.02.10 and §06.35.02.11 are most directly related to the production of research, and Figure 2 gives an overview of commitments made through each of these channels over the period from 2006 to 2011. Together, §06.35.01.10 and §06.35.01.11 constitute the overall frame for support to research on development-related topics and capacity building. The purpose of the former channel, §06.35.01.10, is, according to the Finance Act of 2012, to support the use of Danish competencies in development research, e.g. through establishing long-term partnerships between institutions in Denmark and institutions in partner countries. Specific modalities over the period to be covered by the present evaluation have included support to research networks and to research centres at the Faculty of Life Sciences at the University of Copenhagen (KU-LIFE), and from 2011 support to the “Building Stronger Universities” initiative.

Until 2007, six thematic research networks received Danida support (see Figure 2), but in 2007 the research networks for environment (ReNED), agriculture (NETARD) and governance, economic policy and public administration (GEPPA) were merged into one network, the Danish Development Research Network (DDRN), and support to the NPSD Poultry Network was discontinued. The last commitments were made to the DDRN, the Danish Water Forum and the Research Network for Health in 2009 (totalling DKK 14 million), and covered a final period ending in June 2011.

Support was provided to three research centres located at the University of Copenhagen (KU-LIFE) until 2012. These were continuations of support provided to earlier centres, e.g. the Danida Forest Seed Centre and the Danish Bilharziasis Laboratory, as these were brought under a new institutional organisation within KU-LIFE. From 2011 this support was replaced by the new modality, “Building Stronger Universities in Developing Countries”, which cooperates with “Universities Denmark” (an organisation of eight Danish universities). It comprises four thematic “platforms”: ‘environment and climate’, ‘growth and employment’, ‘human health’, and ‘stability, democracy and rights’.

Finally, since 2011, DKK 10 million a year has been given to the three-year-long international research and communication programme ReCom (co-financed by Sida and coordinated by UNU-WIDER), with the objective of summing up and communicating existing knowledge about the forms of aid that work within each of the five areas: growth and employment; environment, energy and climate; good governance and conflict and stability; social development; and gender equality. ReCom will not be covered by the present evaluation.

The purpose of funding through §06.35.01.11 is to support the development of new knowledge for the benefit of developing countries and to maintain and expand the capacity of these countries to produce research. It is a requirement that supported research involves cooperation between research environments in developing countries and in Denmark, and that it contributes to strengthening research capacity in the developing country. Grants are awarded on a competitive basis after annual calls for applications within priority themes of relevance to Danish development cooperation and to the needs of partner countries. The Consultative Research Committee for Development Research (FFU) evaluates the applications according to three equally weighted criteria: the quality, relevance, and potential effect of the research. The FFU assesses which applications are worthy of support and forwards its recommendation for formal consent by the Danish Council for Strategic Research (DCSR). The Ministry of Foreign Affairs then makes the final decision on which of the applications deemed worthy of support will be funded. While North-South collaboration is a requirement, the main applicant must generally be affiliated with a Danish research organisation. Exempt from this rule are grants financed through the Pilot Research Cooperation Programmes (PRCPs), which were launched in Tanzania and Vietnam in 2008 and in Ghana in 2011 in an effort to make research more South- and demand-driven. In the PRCPs, local researchers draw up concept notes (within the framework of Danish development cooperation) and select Danish researchers for cooperation, after which they jointly prepare the definitive research projects. The PRCPs are further set apart from other funding through §06.35.01.11 in that priority themes are defined in framework agreements with the programme countries, rather than on an annual basis.
Individual projects under these programmes must otherwise meet the same requirements as other research cooperation and are approved through the same process as described above. The Danida Research Portal provides a detailed overview of projects supported through §06.35.01.11[88].

Funding through §06.35.01.15 is for minor studies of a “fact-finding” nature, implemented with the main objective of strengthening the quality of Danish development cooperation.

In preparation for this evaluation, an Issues Paper has been developed, which gives an overview of central issues and challenges of Danida’s research support within agriculture and natural resource management themes. As part of this exercise, an overview of more specific elements, objectives and goals of modalities to be covered under §06.35.01 has been prepared, and this is annexed to these ToR[89]. Funding through §06.35.01.13, §06.35.01.14, §06.35.01.17 and §06.35.01.18 will not be covered.

Funding through §06.35.02 will also not be covered by the evaluation. It comprises support for international organisations that produce research results, provide advisory services, and carry out education and capacity development for the benefit of developing countries within various fields of agriculture, health and social sciences. For some years, there have been efforts to focus this support by narrowing down the number of partner institutions, while awarding each institution multi-year grants. Within the area of agriculture, Danish support to international research is primarily directed at the CGIAR system, which undertakes research focused on increasing agricultural production in developing countries (including in view of climate change). “Other international development research” comprises support to institutions that conduct research within health and social sciences.

In addition to the funding through §06.35, Danida is responsible for a series of grants for other types of development research, which will not be covered by the present evaluation. These include the Universities, Business and Research in Agricultural Innovation (UniBrain) programme (DKK 30 million in 2010 and DKK 99 million for the period 2012-2015), which aims to foster innovative solutions and products, as well as to strengthen the role of research communities and universities in agriculture and agro-industry, and hence economic growth and employment. Further, a series of Danish institutions, including DIIS, the Danish Institute for Human Rights and the Rehabilitation and Research Centre for Torture Victims (RCT), receive core funding contributions from Danida, part of which finances research and analysis, as well as the participation of researchers in Danish and international debates and academic networking. Likewise, Denmark contributes core funding to a series of UN organisations and to the World Bank, as well as substantial earmarked allocations to these organisations’ research activities. Finally, there are research components in certain sector programmes, such as the environmental programme in Bolivia, the business and agricultural sector programmes in Vietnam, the budget support programme in Mozambique, and the transition support programme in Bhutan.

Figure 2 Commitments by funding instrument, support for development research

Commitments by funding instrument (million DKK), with budget codes of the Finance Act:

| Funding instrument                                               | 2006  | 2007  | 2008  | 2009  | 2010  | 2011 | Sum      |
| 1. FFU (§06.35.01.11)                                            | 96.7  | 96    | 132   | 167   | 133   | 91   | 715.7    |
| 1.1 Competitive Research Grant                                   | 96.7  | 96    | 112   | 147   | 104   | 71   | 635.7 ** |
| Operating costs DFC and FFU, total                               |       |       |       |       | 9     |      |          |
| 1.2 Pilot Research Projects (Vietnam, Tanzania, Ghana)           |       |       | 20    | 20    | 20    | 20   | 80 **    |
| 2. Projects in Denmark (§06.35.01.10)                            | 47.1  | 54.7  | 37.4  | 52    | 32.2  | 106  | 329.4    |
| 2.1 Building Stronger Universities (BSU)                         |       |       |       | 5     | 3     | 60   | 68       |
| 2.2 Research Networks                                            | 8.6   | 16.2  |       | 14    |       |      | 38.8     |
| 2.2.1 Danish Water Forum                                         | 1.5   | 3     |       |       |       |      | 4.5 *    |
| 2.2.2 Research Network for Health                                | 0.8   | 3.2   |       |       |       |      | 4        |
| 2.2.3 GEPPA (Governance, Economic Policy, Public Administration) | 1.3   |       |       |       |       |      | 1.3      |
| 2.2.4 ReNED (Environment)                                        | 1.2   |       |       |       |       |      | 1.2 *    |
| 2.2.5 NETARD (Agriculture)                                       | 1.4   |       |       |       |       |      | 1.4 *    |
| 2.2.6 NPSD Poultry Network                                       | 2.4   |       |       |       |       |      | 2.4 *    |
| 2.2.7 Danish Development Research Network (DDRN)                 |       | 10    |       |       |       |      | 10 **    |
| 2.3 Centres at Copenhagen University (KU-LIFE)                   | 38.5  | 38.5  | 37.4  | 33    | 29.2  | 36   | 212.6    |
| 2.3.1 Seed Health Centre                                         | 9.5   | 9.5   | 11    | 8     | 6.8   |      | 44.8 *   |
| 2.3.2 Forest, Landscape and Planning                             | 6     | 6     | 6     | 6     | 5.2   |      | 29.2 *   |
| 2.3.3 Institute for Health Research and Development              | 23    | 23    | 20.4  | 19.0  | 17.2  |      | 102.6    |
| 2.4 ReCom                                                        |       |       |       |       |       | 10   | 10       |
| 3. Minor Studies (§06.35.01.15)                                  | 10    | 10.2  | 6.3   | 6.2   | 11    | 11   | 54.7 **  |
| 4. International Agricultural Research (CGIAR) (§06.35.02.10)    | 38    | 36    | 35    | 35    | 35    | 35   | 214      |
| 5. Other International Development Research (§06.35.02.11)       | 12    | 16    | 15    | 15    | 25    | 25   | 108      |
| Sum                                                              | 203.8 | 212.9 | 225.7 | 275.2 | 235.2 | 268  | 1422     |

Sources: Issues Paper prepared for the evaluation (figures for 2010 and 2011 updated according to “Forskningsredegørelsen 2011”). To be included in the evaluation: * all projects (all agriculture and NRM related); ** only projects related to agriculture and NRM.

2. Evaluation Purpose

The dual purpose of this evaluation is to:

  • Assess, document and explain the relevance, effectiveness and efficiency – and where possible sustainability and impact – of Danish support to development research within the thematic areas of agriculture and natural resource management. Emphasis should be on identifying the core elements of importance for advancing the support and its results. By implication, the support channels, the individual projects, and the important conditions and processes that frame the research (e.g. the criteria that supported activities must live up to, follow-up, incentive structures etc.) must all be analysed.

And on this basis,

  • Provide lessons learned and recommendations which may feed into on-going discussions on how to improve support to development research, and more specifically into the current process of developing an overall strategic framework for support to development research, which is expected to be published in May 2013.

3. Scope of Work

Evaluation period

The evaluation will mainly cover research projects which have been approved and for which commitments have been made between 2006 and 2011 (both years included).

Projects approved before 2006 will not be covered, even if implementation extends into the period covered by the evaluation. An exception should be made for projects approved between 2006 and 2011 that have received funding in several phases, where it will be necessary to look also at phases initiated before 2006.

While this time frame has been chosen so as to allow for tracing and assessment of the processes involved in the supported activities covered by the evaluation, it is clear that the possibilities for assessing achieved outputs and outcomes will vary with the time of initiation of the activities. Thus, for example, research projects approved in 2011 cannot generally be expected to have achieved outcomes yet, and criteria and methods for assessing them will need to take this into account (e.g. by focusing on relevance and efficiency, and assessing effectiveness primarily through evaluation of the plausibility of assumed causal chains).

Thematic focus

The thematic focus of the evaluation will be on ’agriculture and natural resource management’. This should be interpreted broadly to also include, for example, veterinary science, aquaculture, climate change, land use planning, and ’green’ environment. An overview of the relevant research projects funded between 2006 and 2010 is provided in the Issues Paper developed in preparation for this evaluation, but will need to be updated to cover also 2011[90].

Coverage

The evaluation must include analysis and assessment of support to development research within agriculture and natural resource management both overall and at the level of individual projects and modalities. Further, an assessment of the relative merits/complementarities/etc. of different modalities is expected. In order to do this, the evaluation should not only look at initial objectives and achieved results, but also at the processes linking and shaping the two, so as to be able to identify forward-looking lessons on what can be done, how and why, to enhance the merits of the development research support.

The modalities to be covered are:

  • The competitive research project grants


  • PRCP projects


  • The research networks relevant to the theme of the evaluation: Danish Water Forum, ReNED (Environment), NETARD (Agriculture), NPSD (Poultry network) and DDRN (Development research).


  • The research centres (institutional framework contracts) at KU-LIFE. Relevant to include are the “Danish Seed Health Centre” and the “Centre for Forest, Landscape and Planning”.


  • The BSU initiative


  • Fact-finding activities (Minor studies)

While the entire portfolio of relevant research projects (competitive and PRCP) is to be covered descriptively (the Issues Paper identified 92, but this list will need to be updated to include 2011), more in-depth assessments are expected of a selection of projects. The majority of these projects will be from Burkina Faso and Tanzania, where field visits are to be made; however, a fraction of the support in other countries will also be subjected to desk-based in-depth analysis. Criteria for selecting these projects are to be developed by the evaluation team.

Establishing the relation between the in-depth analysis of the selected projects and processes on the one hand, and the descriptive analysis of the overall portfolio on the other, will be important, and the inception report should include reflections on how this will be ensured.

Given its novelty, the BSU initiative cannot be covered comprehensively by the evaluation. However, it will be required to assess the set-up of the BSU initiative, including preliminary experiences.

4. Evaluation Questions

Evaluation questions will primarily be centred on the OECD/DAC evaluation criteria of relevance, efficiency and effectiveness. Considering the short lifetime of a number of the projects to be covered, it is likely to be too early to assess impact and sustainability in some cases. However, where possible (e.g. in the case of research projects that have undergone several phases, and centre contracts), these criteria should also be applied. It will be important to keep in mind, as described in the introduction above, that different support channels have different purposes, and consequently will to some degree need to be assessed against different objectives. Likewise, not all the evaluation questions listed below will be equally relevant for all modalities. At the same time, the evaluation will be required to contain an element of comparison of the relative relevance, effectiveness and efficiency of different categories of support, e.g. North- vs. South-driven, funds subject to competition vs. centre contracts and, for FFU projects, small (e.g. PhD and post-doc grants) vs. larger strategic projects. In undertaking such comparisons, it will be important to compare over a sufficient range of issues to be able to highlight the different merits of different categories, and to determine whether there are likely to be trade-offs implied in prioritising certain categories over others. Further, to meet the objective of contributing to learning, and in order to ensure a basis for recommendations for future support, the evaluation should strive to explain how and why support has achieved the results it has (in terms of relevance, effectiveness, efficiency and, where relevant, impact and sustainability).

Relevance

As a point of departure, the criterion of relevance relates to the extent to which the objectives of an intervention are consistent with beneficiaries’ requirements, needs and overall priorities, and with partners’ and donors’ policies, strategies etc.

In the present case, relevance will need to be assessed at different levels, and questions to be answered will include, but not necessarily be limited to the following:

  • Is the composition of the portfolio of Danish support relevant in view of partner country policies and strategies?


  • Is the composition of the portfolio of Danish support relevant in view of Danida policies and strategies?


  • Are the objectives of specific projects/networks/centres/studies relevant in view of knowledge gaps and/or information needs and, where relevant, research capacity needs within the substance area and countries addressed?


  • How and to what degree is the relevance of the overall portfolio of projects and mix of modalities continuously ensured (e.g. by responding to changing Danish or partner country priorities etc.)?


  • How appropriate is the procedure for formulating the research themes guiding FFU annual calls for applications in terms of ensuring relevance, particularly in terms of usefulness (to partners and/or Danida)?


  • How appropriate and relevant are the different channels and modalities, individually and compositely? Where relevant, this assessment should be made in view of the principles set forth in, inter alia, the Paris Declaration on Aid Effectiveness (particularly alignment and ownership), and in terms of ensuring that research is demand-driven.


  • How appropriate are the different strategies for ensuring and enhancing relevance to end users, both in terms of content, partner involvement, dissemination etc.?

Effectiveness

As a point of departure, the criterion of effectiveness relates to the extent to which the intervention’s objectives were achieved, or are expected to be achieved, taking into account their relative importance. In the present case, this entails assessing the delivered outputs and, to the degree possible, outcomes – in terms of research/knowledge produced, disseminated, put to use etc. and research capacity built – for individual projects, and to the degree possible also at portfolio level.

It will be important to assess effectiveness in terms of the stated objectives of Danida support to development research, including the specific objectives of individual modalities, as well as the specific objectives of individual projects. The assessments are, as far as possible, expected to include analysis of the additionality of Danida support.

It should be noted that the assessments of the different types and levels of results must consider the processes and procedures that have shaped the achievements, so as to be able to identify the important points and lessons. The evaluation is expected to include the following:

  • To what extent have planned outputs and objectives of individual research projects, special studies, centres or networks been achieved?


  • What is the quality level of produced research?


  • What is the nature of the collaboration established between Danish and developing country research institutions? Has Danida support been decisive in establishing this collaboration? Might different forms of collaboration have emerged in the absence of Danida support?


  • Has sufficient attention been given to building research capacity (at the level of individuals/institutions/country, where relevant), and have such efforts been successful?


  • Have sufficient efforts been made to disseminate research findings, and have findings been communicated in a way that promotes use in partner countries, internationally and/or in Danida?


  • How and to what degree have research findings been put to use – internationally, in Danida programmes and/or by national partners?


  • How appropriate/plausible are assumed causal pathways from research and/or capacity building outputs over behavioural outcomes to wider impacts?


  • How adequate are the monitoring systems and follow up requirements imposed by Danida?


  • How (if at all) is coherence and synergy between different channels and modalities (competitive grants, PRCP project grants, research centres, networks, BSU) ensured?

Efficiency

The criterion of efficiency can be seen as a measure of how economically resources/inputs (funds, expertise, time, etc.) are converted to results. It is not expected that a full cost-benefit analysis can be carried out. However, it is expected that the evaluation will assess whether resources have been put to good use, as well as the strengths and weaknesses of different types of activities, and also the differences between different modalities of support (e.g. the South-driven pilot projects versus the North-driven larger strategic FFU projects, or activities under the centre contracts). The evaluation is expected to include the following:

  • What is the level of resources employed in the process of administering and monitoring support, and is this appropriate? This will include a comparison between different channels / modalities.


  • Has division of labour between the FFU, UFT, the Danida Fellowship Centre, the embassies and partner institutions been appropriate – for individual modalities and in terms of the mix of modalities?


  • Have projects been undertaken as planned, i.e. have they followed their time schedules, used the planned resources, and delivered outputs as planned?


  • Has sufficient attention been given to ensuring harmonisation with other donors’ support?

Impact and Sustainability

Impact and sustainability can be seen as interrelated in the sense that impact relates to the wider and longer-term effects, and sustainability to whether effects and achievements will be sustained over time. Impacts of support to development research include the positive and negative changes produced by the support (i.e. research and capacity building projects), whether direct, indirect, intended or unintended.

In relation to support to development research, this could be translated into the following questions:

  • What difference have the cooperation, capacity building and/or research projects made to the participating researchers and the participating institutions?
  • What difference have the findings and/or the application of the results made to the field of research, the sector and/or the local end users?
  • Who has benefitted from the improved research capacity (direct beneficiaries such as those researchers who have been trained, as well as the indirect beneficiaries, such as their students) and how?
  • Who has benefitted from the application of the findings to help solve development problems and how?

It is not expected that the evaluation will be able to address these questions systematically. However, it will be expected to explore the possibility of conducting example-based impact assessments, exploring the questions for a few multiple-phased projects, with due consideration of the limitations of such examples.

5. Methodology

The evaluation will be carried out in accordance with the Danida Guidelines for Evaluation (Danida, 2012) and the OECD/DAC Evaluation Quality Standards (2010). As part of the preparation of the evaluation, the Evaluation Department has commissioned a study of recent evaluations of likeminded donors’ support to development research, which will provide an overview of objectives and methodological approaches. The contracted evaluators may use this study, as well as their expertise on the subject matter and evaluation methodology, to develop a suitable methodology during the inception phase. The evaluation is expected to assess different types and levels of results with consideration of the processes leading to them. This will place substantial demands on the collection of data, in terms of both quantity and quality. By implication, it is expected that the evaluation will need to build on different analytical methods, and utilise different types of information, separately and in combination, to answer the different questions. As a minimum, the methodology should incorporate the principles, elements and considerations outlined in this section.

It must entail a combination of desk studies, bibliometric analysis and primary data collection in Copenhagen and in the field study countries, as well as distance interviews with stakeholders in other countries, and is expected to be based on both quantitative and qualitative methods of analysis. Triangulation of both data and methods is of core importance, and the inception report must describe how this will be ensured. The following elements will be required in the overall methodology:

  • A literature review to place the evaluation into the context of development cooperation, and of the research institutions in Denmark and in the partner countries. It will include a review of documents related to Danida support to development research, policies and priorities of Danish development cooperation as well as partner countries, and other donors’ experiences and policies regarding research support. The Issues Paper and overview of relevant evaluations developed in preparation for this evaluation may guide part of this review.


  • Quantitative description of the outputs of supported research projects, fact-finding activities (minor studies), networks and centres, in terms of publications, articles in scientific journals, books, conference papers etc., as well as “second rank outputs” such as patents. Much of this information will be available from monitoring information and is summarised in the Issues Paper prepared for the evaluation (though this does not cover 2011). However, it is expected that it will be necessary to supplement this with a survey administered to representatives of the supported projects etc., including in order to identify outputs produced after project completion. A draft format of this survey must be included in the inception report.


  • Bibliometric analysis of publications and citations, so as to provide input to the assessment of knowledge production and uptake.


  • Quantitative description of outputs related to capacity building, including the number of master’s degrees, PhDs and post-docs financed, the number of local staff trained (e.g. lab technicians, teachers etc.), course material developed, upgrades of physical research facilities, etc. Again, this information is available in part through monitoring information and is summarised in the Issues Paper, but it will need updating and supplementation with a survey (e.g. to assess the degree to which PhDs and post-docs continue their academic careers).


  • Survey administered to stakeholders in contact with the supported centres and members of supported networks as well as recipients of competitive research grants (in Denmark and in partner countries).


  • Interviews with key informants, including Danida staff, partner country authorities, the FFU, DFC, research institutions and individual researchers (in Denmark and in partner countries), as well as the intended users of the research outputs. These interviews will be conducted with the aim of validating and substantiating quantitative findings, as well as assessing effects and issues related to relevance, efficiency, sustainability and impact that are best addressed qualitatively. In particular, the processes that frame supported activities (selection criteria, follow-up, incentive structures etc.) and issues related to the relevance and quality of outputs and outcomes (research and capacity building) will primarily need to be addressed through interviews.


  • Field studies in the two selected countries. At least 50% of the supported projects in the field study countries will be assessed in depth, and the evaluators should present a well-founded proposal for selection criteria, as well as considerations regarding the implications of the proposed selection. It is important that the selection ensures a certain level of representativeness as well as possibilities for drawing lessons, e.g. regarding successful practices, for future support. In addition to assessing the specific research projects, field studies should explore how national institutions are involved in and gain from research cooperation through specific projects, networks and centres. In Tanzania, particular attention will be given to the demand-driven pilot research cooperation projects.


  • In-depth analysis of a further number of research projects (from non-visited countries), exploring issues related to the quality and use of research results, as well as management and processes. Proposed selection criteria and projects meeting these criteria should be presented in the inception report. Again, it is important that the selection ensures a certain level of representativeness as well as possibilities for drawing lessons for future support. Further, priority must be given to larger projects, including those in their second or third phase, in order to gain the longer time perspective relevant for research impacts, and at least two projects must be PRCP projects (from non-visited countries).

As indicated, different mixes of data and methods of analysis will be required to assess activities with different purposes (or combinations of purposes). The technical proposal for the evaluation is expected to include proposed strategies and indications of methodologies for assessments with respect to the various purposes of the support mentioned above. These methodologies will then have to be further developed in the inception report.

As the evaluation is expected to work with a combination of different analytical tools, yielding information with different levels of coverage and depth (e.g. in-depth case studies and broader compilations of quantitative information), care must be taken to maximise the mutual benefit and value added of the information gathered. The analytical strategy is thus expected to consider how to use the information so as to enhance validation and triangulation throughout.

In addition to providing precise and internally valid conclusions regarding the support as it has been implemented, it will be important that the evaluation – through attention to the role of context (thematic, institutional, Danish, partner country, etc.) – reflects on the likelihood that “lessons learned” will be applicable more widely to future Danish support to development research, and that recommendations reflect such considerations. In other words, in formulating recommendations, attention must be given to the likely extent of external validity of the conclusions (or the conditions under which the conclusions are likely to carry over), and to how this influences the applicability of the different recommendations, from the overall strategic level to the more specific levels, and for different actors, as relevant.

Comments on these ToR and a further elaborated proposal for the methodology, approach, work plan and organisation of the evaluation form part of the tender criteria for the selection of the evaluation consultant.

6. Outputs of the Evaluation

The outputs of the evaluation include: i) an inception report, ii) an evaluation report, and iii) relevant process documents and data (an evidence database).

An Inception Report in draft(s) and final version (not exceeding 30 pages excluding annexes), based on a desk study and a first round of interviews in Copenhagen. The inception report shall include a comprehensive presentation of the context of the evaluation and a thorough description of the evaluation design. This will include project selection criteria and an outline of the content of the projects selected; an evaluation matrix with indication of data sources and coverage, triangulation strategy, analytical methods, etc.; as well as the implications of the choices made. Descriptions of how and to what degree the analysis will allow the evaluation to assess the different areas and levels of support, the different modalities, as well as the overall picture of the achievements of the research support, should be included. The inception report will also indicate whether any changes to the evaluation questions are appropriate, and present a detailed work plan to facilitate the logistics of the field work in advance. Further, an outline of the expected structure of the evaluation report must also be included. The draft(s) of the inception report must be submitted to EVAL for comments, and the final version is subject to EVAL's approval.

An Evaluation Report in draft(s) and final version (not exceeding 60 pages excluding annexes, and with cover photo proposals). The evaluation report must include an executive summary of maximum 10 pages, an introduction and background, a presentation and explanation of the methodological approach and its analytical implications, and findings, conclusions and recommendations. Systematic referencing to the gathered evidence should be made throughout.

The draft(s) of the evaluation report must be submitted to EVAL for comments. As part of this process, EVAL will invite comments from the Evaluation Reference Group and other stakeholders. The evaluation report will be made public once printed by EVAL.

The full text of the ToR can be found on http://evaluation.um.dk/


[88] http://drp.dfcentre.com/

[89] Please find full ToR at http://evaluation.um.dk/

[90] Broegaard, R.B. (2012): ‘Issues paper for future evaluation of effect of Danida supported research on agriculture and natural resources’ (plus data files).




This page forms part of the publication 'Evaluation of Danida supported Research on Agriculture and Natural Resource Management 2006-2011' as chapter 15 of 16
Version 1.0. 09-09-2013
Publication may be found at the address http://www.netpublikationer.dk/um/11214/index.htm

 
