Appendix 1 – Terms of Reference
The Secondary Education Support Programme (SESP) came into effect in 2003 as a joint programme implemented under the aegis of the Government of Nepal with support from the Asian Development Bank and Denmark, with a total basket of USD 74.8 million over a seven-year period.
The principal policy goal of the SESP is to strengthen the involvement of local communities in the running and funding of their own schools, with assistance from and under the supervision of the national government. The main vehicle for securing this greater involvement is the design and implementation of the School Improvement Plan (SIP).
The purpose of the SESP is to provide funding and technical assistance to achieve essential improvements in the quality of teaching, the curriculum and learning environments, while steadily building capacity at the central and local levels to take forward these improvements in the future and to fund from their own sources a higher level of recurrent and development expenditure.
The three main objectives of SESP are:
- Improvement in access and equity in secondary education;
- Improvement in the quality and relevance of secondary education; and
- Improvement in the institutional capacity to support a school-focused secondary education system.
The main outcomes the SESP aims to achieve are:
- To raise Gross Enrolment Rate (GER) in lower secondary from 55 per cent to 65 per cent, and in secondary from 35 per cent to 55 per cent by 2007
- To raise the participation of girls from 40 per cent to 50 per cent in both lower secondary and secondary education, and to achieve a similar increase in the participation of disadvantaged groups by 2007
- To raise and sustain measurable improvements in educational outcomes as evidenced by the grade 8 and SLC examinations: the numbers passing grade 8 should increase, as should those passing the SLC, with proportionate increases for girls and students from traditionally disadvantaged groups.
SESP is now nearing the end of its current phase of implementation and will be transformed into a sectoral framework within the School Sector Reform (SSR) from 2009 onwards. Although the SESP interventions will continue, they will form part of the holistic school sector reform, which focuses on a grade 1-12 school structure comprising basic education (grades 1-8) and secondary education (grades 9-12).
The evaluation of EFA has already begun and is expected to feed into the SSR Appraisal in March 2009. At this juncture, the joint EFA and SESP Mission held in November 2008 considered the need for a separate SESP evaluation alongside the EFA evaluation, and agreed to carry out such an exercise, reflecting on the overall programme interventions and assessing the outcomes, thereby paving the way for the programme's continuation and integration within the SSR framework. The SESP evaluation is thus deemed crucial for drawing lessons from the current programme and setting the stage for improving and expanding the secondary education programme across the country.
2 Evaluation Purpose and Objectives
The Ministry of Education (MoE) and the development partners agreed to the purpose and objectives of this evaluation during the EFA and SESP joint mission in November 2008.
2.1 Evaluation Purpose
The overall purpose of the evaluation is to provide information about the outcomes, and to document early signs of impact, of the SESP that the MoE, the development partners and other education stakeholders can use to improve the policy framework and further the design of the on-going SSR.
2.2 Evaluation Objectives
The overall objective to which this evaluation is intended to contribute is an improved foundation for the design of effective secondary education reform measures and intervention strategies. This will be achieved through the fulfilment of five immediate objectives designed to identify, document and disseminate key outcomes and lessons learned from the SESP.
The five-fold immediate objectives of the evaluation are to:
- Assess the efficiency, effectiveness, relevance and sustainability of each of the four components of the SESP as well as the SESP as a whole.
- Evaluate the strategies, approaches and methods adopted by the SESP in achieving the three main objectives.
- Assess the performance of (1) the implementing partners in the implementation of the respective components/activities and (2) the schools in the use of goods and services provided.
- Analyze the anticipated and unanticipated, positive and negative impact of the SESP on the target groups.
- Identify the lessons learned and best practices developed by the SESP, and elaborate forward-looking recommendations that could contribute to policy development and the future development of secondary education in view of the on-going SSR.
It is expected that the immediate results of the evaluation will be that MoE's current work in sample districts (PID and non-PID) is assessed, documented and disseminated in a systematic way, and that key information on outcomes and lessons learned is received and processed as input for the SSR initiatives.
In the medium term it is also expected that the evaluation will contribute to improved evidence-based and results-oriented programming as well as increased visibility of MoE’s work. Through the involvement of the MoE’s M&E staff throughout the evaluation, it is also expected that the evaluation will contribute to a strengthened M&E capacity, in particular with regard to evaluations. In addition, the evaluation is expected to provide information relevant for both accountability and learning purposes for the development partners ADB and Danida.
3 Evaluation Framework
The evaluation will have to cover three broad and interlinked areas:
- study of the early impacts of the SESP interventions with regards to the target groups (including the participating students, teachers, Headteachers, School Management Committee members and school as a whole), and as compared to the main objectives and the 2012 vision described in the Core Document;
- assessment of the performance of the SESP covering relevance of the objectives, efficiency, effectiveness and sustainability; and
- assessment of the performance of the partners, including ADB and Danida, but in particular the implementing partners at different levels, such as the MoE, DoE, CDC, NCED, OCE, DEOs, schools in the Programme Intensive Districts (PIDs) and other entities involved in either delivering or receiving SESP products and services.
In developing the evaluation methodology, it is expected that an approach combining quantitative and qualitative methods of collecting and analyzing data will be used to assess the outcomes, and potentially the impacts, of the SESP. This should include a quantification of these where possible and relevant, as well as explanations of the processes and interventions that fostered or hindered their achievement. For methodological rigor, the evaluation should estimate the counterfactual, that is, what would have happened had the SESP never taken place. To this end, it is expected that the evaluation methodology will include a comparative approach whereby the investments and progress made in the PIDs are compared with those in non-PIDs, so as to generate lessons learned about the value added of the SESP strategies and approach.
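The comparative PID/non-PID design described above is, in effect, a quasi-experimental estimate of the counterfactual, and one common way to operationalize it is a difference-in-differences calculation. The sketch below is a minimal illustration only: the function name and all enrolment figures are invented for the example and are not actual SESP data.

```python
def diff_in_diff(pid_before, pid_after, non_pid_before, non_pid_after):
    """Difference-in-differences: the change observed in the PID group
    minus the change in the non-PID comparison group. The comparison
    group's change stands in for the counterfactual trend, i.e. what
    would likely have happened in the PIDs without the programme."""
    return (pid_after - pid_before) - (non_pid_after - non_pid_before)

# Invented gross enrolment rates (percentage points), purely illustrative.
effect = diff_in_diff(pid_before=35.0, pid_after=52.0,
                      non_pid_before=36.0, non_pid_after=44.0)
# effect = (52 - 35) - (44 - 36) = 9 percentage points attributable
# to the programme, under the parallel-trends assumption.
```

The validity of such an estimate rests on the assumption that PID and non-PID districts would have followed parallel trends in the absence of the SESP, which the evaluation team would need to argue for or test.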
3.1 Early signs of impact on the students, teachers, principals and schools
The SESP is intended to contribute significantly to the achievement of the 2012 vision described in the Core Document. Through the delivery of a range of outputs and services, the SESP is expected to have an impact on the behaviour and performance of key stakeholders at various levels, ranging from students and schools to district- and central-level service providers.
A basic requirement in assessing the early signs of impact on students, teachers, principals, schools, and the service delivery system in general is to assess the manner in which they have or have not changed and the extent to which the SESP was responsible. Through its vision for 2012, the SESP Core Document describes the situation that is expected to prevail in 2012. The evaluation will assess the current status of progress towards this 2012 vision, and establish whether the 2012 vision remains a realistic and desirable target given the SSR and the current implementation, political and economic context.
3.2 Performance of the SESP
(i) Relevance of Objectives
The extent to which the SESP objectives are consistent with the MoE's policies on secondary education (as well as with relevant international frameworks) needs to be assessed.
The relevance factor determines whether the SESP was worth doing. It assesses whether SESP objectives were focused on the right priorities when designed, and if they were adjusted to suit changing circumstances during implementation.
(ii) Effectiveness
Effectiveness is the extent to which the planned outputs, expected outcomes (immediate objectives) and intended impacts (development objectives) are being or have been produced or achieved. It should be noted that the assessment of early signs of impact has been singled out as an area of specific concern (see above).
(iii) Efficiency
Efficiency is the extent to which the SESP achieved, or is expected to achieve, benefits commensurate with inputs, based on economic and financial analysis or unit costs compared with alternative options and good practices.
The conventional project economic indicator, the economic rate of return (ERR), presents a well-recognized test of efficient resource use. Sometimes, qualitative judgments by evaluators are necessary to assess efficiency, but these should always rely on an appreciation of the underlying concepts of cost-benefit analysis, together with good practice in similar situations and any other suitable indicators.
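For readers less familiar with the ERR, it is the discount rate at which the net present value (NPV) of a project's cost and benefit stream equals zero. The sketch below illustrates the concept with a simple bisection search; the cash-flow figures are invented for the example and bear no relation to actual SESP costs or benefits.

```python
def npv(rate, cash_flows):
    """Net present value of a cash-flow series (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def err(cash_flows, lo=-0.99, hi=10.0, tol=1e-6):
    """Economic rate of return: the discount rate at which NPV = 0.
    Found by bisection; assumes the cash-flow series changes sign
    once (costs up front, net benefits thereafter), so that NPV is
    monotonically decreasing in the rate."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid  # NPV still positive: the root lies above mid
        else:
            hi = mid  # NPV negative: the root lies below mid
    return (lo + hi) / 2

# Illustrative figures only: an up-front cost of 100 followed by
# annual net benefits of 25 for six years.
flows = [-100, 25, 25, 25, 25, 25, 25]
rate = err(flows)  # roughly 0.13, i.e. an ERR of about 13 per cent
```

A project is conventionally judged an efficient use of resources when its ERR exceeds the opportunity cost of capital used in the appraisal.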
Assessments of efficiency (in the absence of ERR) should take into account, inter alia, the following factors:
- actual costs compared with appraisal estimates and any revisions;
- implementation delays and any redesign that may have increased costs;
- the level of benefits and their growth curves compared with expectations (if feasible);
- utilization rates for SESP facilities and services;
- whether services and facilities meet good practice standards; and
- whether the benefits stream appears adequate compared with the costs.
(iv) Sustainability of the SESP
Sustainability of the SESP is an important part of the assessment, both to understand the performance of the SESP so far and to facilitate the integration of lessons learned and recommendations regarding sustainability into future work. A further area of assessment that should be considered in relation to the sustainability of the SESP and the outcomes and impacts achieved is the overarching question of the innovation and replicability of the SESP interventions.
3.3 Performance of the Partners
The performance of the partners needs to be assessed against the implementation agreements and planned outputs (products and services) set out in the SESP Core Document.
The items to be assessed include, but are not necessarily limited to, the following:
- The extent to which the MoE, ADB and Danida complied with the signed agreements;
- The extent to which the MoE and its implementing line agencies prepared and implemented annual plans and budgets (ASIP) in a manner that facilitated the timely delivery of the agreed SESP outputs in the agreed quality and quantity;
- The extent to which the donors made available the agreed inputs (financing and Technical Assistance) in time and in the right quantity and quality;
- The extent to which different implementing line agencies within MoE system at central, regional, district and school level, facilitated or hampered the SESP implementation and contributed to the achievement of SESP objectives;
- The extent to which the MoE’s coordination, management and oversight functions facilitated the timely identification and assessment of implementation problems and the timeliness and appropriateness of the corrective measures taken; and
- The extent to which the intended target groups took ownership of the produced/delivered outputs and used these for intended or unintended purposes.
4 Expected Outputs
Four (4) weeks after the commencement of the assignment, the Evaluation Team should submit an inception report and subsequently present their work at a seminar in Nepal (see tentative timeline below). The consultants should undertake logistical preparation of the seminar. The inception report should outline the refined methodology, including an evaluation matrix. This is expected to include the evaluation questions and to firm up the impact points, with indicators for each impact point, the methodology for measuring the indicators, sources of data (including the types, sizes and geographical coverage of the PID and non-PID samples to be surveyed) and data collection instruments (questionnaires, focus group discussion guides, checklists, etc.). It should further include preliminary results and pinpoint any specific challenges or areas of interest encountered or foreseen in the upcoming work. Based on the discussion and comments on the inception report, the consultants may be required to submit a revised version and/or a work plan containing any revisions needed.
At the end of the period, the draft final report should be submitted and subsequently presented at a seminar in Nepal. The consultants should undertake logistical preparation of the seminar. A hard and soft copy of the draft final report, as well as a brief PowerPoint presentation of the Evaluation Team's key observations and analysis, together with the main conclusions and recommendations, should be provided to the Management Group no later than three working days before the seminar.
The final evaluation report should incorporate the feedback received at the seminar and be presented within three (3) weeks of the date of the seminar. The final report should be a well-documented, comprehensive SESP Evaluation Report that addresses the objectives listed under Section 2 above, in a format acceptable to the MoE, Danida and ADB. The final report and a brief summarizing PowerPoint presentation should be provided in hard and soft copy to the MoE.
The evaluation process and the report must comply with international standards (see the Evaluation Guidelines of EVAL, Danida).
5 Evaluation Management
The evaluation will be jointly managed by a management group comprised of representatives of MoE and EVAL (the premises for the joint evaluation management and the inputs of the partners are clarified in the agreed minutes of the partners’ meeting in Kathmandu, February 2nd, 2009).
The management group will be responsible for the overall management of the evaluation (selection of consultants, chairing reference group meetings and seminars, quality assurance of the evaluation work, etc.). An important aspect of the quality assurance is to ensure that the evaluation is conducted in accordance with international standards, as mentioned above (see the Evaluation Guidelines of EVAL, Danida).
Further, two reference groups will be established, one based in Kathmandu, Nepal and the other based in Copenhagen, Denmark, to ensure the input of relevant stakeholders, help validate the findings and gain wider ownership. It is expected that the Nepal-based reference group will include (but not necessarily be limited to) ADB and the Monitoring and Evaluation Section of MoE. The reference groups will function in an advisory manner and give comments that the management group can carry forward to the consultants.
In line with international standards, it is stressed that the responsibility for assessment and interpretations rests with the consultants, and that the independence of the evaluation team will be maintained and respected throughout the process.
6 Logistical Arrangements
It is envisaged that the successful completion of this evaluation will rely heavily upon the mobilization of a strong team of specialists. The evaluation team should consist of consultants with extensive relevant international experience, as well as consultants with in-depth knowledge of Nepal and the education sector of Nepal. The team must possess strong competence and experience in the fields of complex evaluation, educational assessment, educational management, economics of education and international development cooperation. The team must further possess the relevant methodological skills required for an approach based on both quantitative and qualitative analysis, preferably including econometrics. The team must be able to work and write fluently in English and Nepali.
One of the team members will be assigned as Team Leader and should possess strong analytical abilities and excellent writing and presentation skills, as well as proven experience with complex, preferably joint, evaluations.
The duration of the study is expected to be from early March to late July 2009; see the tentative timeline below. In order to complete the evaluation task in a timely manner, the consultants must be willing and able to undertake at least two field trips to Nepal and be available for the seminars.
A no conflict of interest policy will apply when selecting the consultants.
7 Tentative Evaluation Schedule
| Date | Milestone |
| --- | --- |
| 03 Feb 2009 | Finalization of ToR |
| 04 Feb 2009 | Announcement of the Evaluation |
| 20 Feb 2009 | Deadline for submission of Expressions of Interest |
| 06 Mar 2009 | Signing of contract |
| 06 Apr 2009 | Submission of inception report |
| 09 Apr 2009 | Seminar on inception report |
| 19 Jun 2009 | Submission of draft final report |
| 26 Jun 2009 | Seminar on draft final report |
| 24 Jul 2009 | Submission of final report |
This page forms part of the publication 'Joint Evaluation of the Secondary Education Support Programme' as chapter 13 of 14
Version 1.0. 17-05-2010
Publication may be found at the address http://www.netpublikationer.dk/um/10395/index.htm