The Evaluation Department (EVAL) of the Danish Ministry of Foreign Affairs (MFA) commissioned Itad to conduct this first comprehensive evaluation of Danida’s humanitarian action since 1999. This synthesis report presents the main findings, conclusions and recommendations of the evaluation, drawing on case study interviews and data collection in South Sudan, Syria and Afghanistan, as well as interviews with Danida and its partners at headquarters level.
An Evaluation Reference Group (ERG), chaired by EVAL and consisting of representatives from relevant departments, including the Humanitarian Action, Civil Society and Personnel Advisors department (HCP) and the Technical Advisory Service (UFT), oversaw the evaluation. In addition, EVAL contracted a three-person expert panel as part of its quality assurance process. The members were Randolph Kent from the Humanitarian Futures Programme at King’s College, Sara Pantuliano, Director of the Humanitarian Policy Group at the Overseas Development Institute, and Ed Schenkenberg, Executive Director of HERE-Geneva. The panel’s role was to provide input into the evaluation process and help ensure that the evaluation is useful, relevant and of a high quality.
The evaluation team comprised Tasneem Mowjee (Team Leader), David Fleming, Erik Toft and Teresa Hanley.
The present Strategy for humanitarian action, launched in September 2009, sets out the overall objectives, key directions and priorities for Danish humanitarian action, and outlines the instruments that will be used to implement it. The Strategy stipulates that its implementation would be subject to an independent evaluation in 2015, in order to inform the formulation of a new humanitarian strategy. The MFA plans to develop the revised Strategy in 2015 and launch it in 2016 after the World Humanitarian Summit (WHS), so that the revised Strategy can reflect the outcomes of the summit. The MFA decided to begin the evaluation in 2014, with a follow-up phase in early 2016, to ensure that the evaluation contributed to the Strategy revision process and to improving Danida’s ways of working even before the end of the strategy period.
The evaluation has two specific objectives. These are to:
- inform Danida’s decision-making and strategic direction when it formulates its new Strategy for humanitarian action after 2015;
- document the results achieved through the implementation of the Strategy.
The second objective is important for helping Danida to understand whether its focus on implementing certain strategic priorities is delivering the desired results. It is particularly relevant because of the lack of a comprehensive and well-structured monitoring system within Danida to document and analyse the results of interventions (Kabell 2013: 8).
More specifically, the objective of the evaluation is to provide answers to six overarching evaluation questions specified by the Terms of Reference (ToR). Together these cover the seven criteria for the evaluation of humanitarian assistance – relevance, effectiveness, efficiency, impact, sustainability, coherence and coverage – stipulated by the Organisation for Economic Co-operation and Development’s Development Assistance Committee (OECD/DAC). The six questions are as follows:
- How relevant and flexible is the Danish Humanitarian Strategy given the changing humanitarian context since 2010?
- How relevant and effective has Danida’s engagement been in the international policy dialogue on humanitarian issues?
- What lessons can be drawn from relying on partnerships as the key implementing modality?
- How well does Danida support and ensure follow-up, monitoring and reporting of performance by partners, including ensuring reporting on the effects on affected populations?
- What lessons have been learned from linking emergency relief and development, that is, reconciling humanitarian and development objectives in specific contexts and settings?
- To what extent do the design, delivery and management of the Humanitarian Strategy align with the Principles and Practice of Good Humanitarian Donorship?
The evaluation focuses on gathering evidence of results across the entire implementation period from 2010 to 2014. It covers both the policy and operational levels in order to inform the development of the new strategy. At the policy level, the team reviewed the coherence and clarity of the strategic framework and its usefulness in guiding Danish international policy dialogue efforts as well as allocation and implementation decisions. At the operational level, the team assessed the implementation performance of partners and documented and analysed results across three selected case studies: two full field-level studies in South Sudan and Jordan/Lebanon (for the Syria crisis response), and one desk-based study of Danish assistance to Afghanistan. The team also drew on lessons from other countries and crises where evaluative evidence exists.
The evaluation’s primary users will be stakeholders at the MFA and Danida’s implementing partners, while the Danish Parliament and the general public are likely to be secondary users.
As described in Annex B, Danida has adopted an innovative approach to the evaluation by commissioning the main evaluation before the end of the Strategy’s implementation period and then commissioning a follow-up on the implementation of its recommendations. The follow-up phase will take place in January-February 2016 and will review any changes to the management of humanitarian assistance. At the end of the follow-up phase, the team will conduct a participatory stakeholder workshop that should contribute both to the finalisation of the revised Strategy and to Danida’s preparation for the World Humanitarian Summit in Istanbul in May 2016.
This synthesis report presents the main findings, conclusions and recommendations at the level of Danida’s global strategy. It is based on document reviews and interviews conducted by the evaluation team at the global level, as well as on two in-depth field-level case studies (South Sudan and the Syria crisis) and one more limited desk study (Afghanistan). These three case study reports are included as annexes to this report.
Chapter 2 of this report sets out the evaluation approach and methodology, including the main challenges and limitations. Chapter 3 provides an overview of Danida’s Humanitarian Strategy and funding. Chapter 4 presents the main findings of the evaluation against each of the six overarching evaluation questions and 19 sub-questions. Finally, Chapter 5 sets out the evaluation’s conclusions and recommendations.