5 Programme Management
This chapter evaluates the performance of the main implementing partners in managing SESP implementation efficiently and effectively (Sections 5.1 to 5.3) as well as their ownership of the programme (Section 5.4). The assessment is summarised in Section 5.5, and Section 5.6 presents specific recommendations for the consideration of the GoN and its partners.
5.1 Coordination and Management
One of the key questions to be answered is the extent to which MoE’s coordination, management and oversight functions facilitated the timely identification and assessment of implementation problems, as well as the timeliness and appropriateness of the corrective measures taken. As pointed out in the Core Document, the MoE is the executing agency for the programme. The MoE delegated responsibility for implementing the programme to the Department of Education (DoE), which, together with NCED, CDC, JEMC and OCE, was responsible for overall implementation. The ESAT Office in turn was responsible for supporting the daily management of SESP.
The Core Document foresaw the establishment of the SESP Management Committee for programme coordination and the Secondary Education Programme Execution Board for overall strategic management. These bodies were not formally established but coordination did take place regularly through monthly coordination meetings of a more informal nature led by DoE with participation from, among others, NCED, CDC, OCE and JEMC.
Most implementing partners interviewed by the evaluation team expressed satisfaction with the coordination role played by DoE. This is a notable achievement given that none of the implementing partners normally report to DoE; in fact, all of these entities normally report to MoE in the same way that DoE does. Some of the interviews did, however, reveal some degree of dissatisfaction with the way DoE planned and allocated funds for capacity development interventions. The actual allocation appears to have been the result of bilateral negotiations rather than an analysis of needs and performance gaps. Accordingly, although DoE has played a satisfactory role, more could arguably have been done to involve all partners early on in developing a coherent framework or strategy for capacity development interventions.
All stakeholders interviewed by the evaluation team have expressed satisfaction with the role played by ESAT in terms of serving the interests of the programme and providing funding for specific technical assistance (TA) needs. It further emerges from consultations in Kathmandu that the ESAT set-up has been well institutionalised into the MoE.
More strategic management was provided through joint annual reviews conducted by GoN and the development partners. According to DoE, the relationship with the development partners has been good and constructive despite difficulties at the beginning. The same applies to the relationship between the two donors. Towards the second half of the programme, reviews of SESP were carried out in tandem with EFA reviews. However, this has not necessarily improved coordination between the two programmes, as very few EFA donors other than Danida and ADB were present during the SESP part of the reviews. Only towards the end of SESP, when SSR was being formulated and appraised, did other development partners take an interest in the programme.
The annual reviews were instrumental in re-designing the programme in connection with the 2006 Mid-Term Review (MTR). The MTR demonstrated responsiveness to poor implementation rates in the first years of the programme and agreed on a number of recommendations designed to provide for more efficient implementation. As the section below demonstrates, this clearly led to more satisfactory implementation rates in the months and years following the MTR. However, not all of the recommendations agreed at the MTR have subsequently been followed up. The agreement to put in place a coherent capacity development plan, for example, was not acted on in the years following the MTR. Moreover, the Aide Memoire from the MTR did not clearly indicate who should be responsible for following up on this and many of the other recommendations made.
Overall, the annual reviews are assessed to have played an important role in taking stock of programme progress and fine-tuning SESP interventions. It can, however, be argued that the DoE and the annual reviews were somewhat late in reacting to the low implementation rates in the first years. Some stakeholders have indicated to the evaluation team that the MoE/DoE could have done more to base its management decisions on data and information (evidence-based policy) rather than on other criteria. As an example, it has been reported that the Ministry and the Department were late in responding to reports of late transfer of scholarships and salaries to teachers. Stakeholders have also shared information with the evaluation team suggesting that the various sections of the DoE could do more to exchange information and coordinate with each other. A more detailed analysis of DoE and its internal processes would be required to confirm or reject these views. However, it is clearly important that the decision-making bodies within DoE are closely linked to the sections responsible for collecting and analysing data on educational outputs and outcomes. The system clearly collects a great deal of data, and it is important that this data is put to full use.
No changes were made to the indicators and/or targets over the life of SESP despite major changes in context that could have justified this. The Aide Memoire for the November Joint Consultations notes, for example, that the need to revise targets was acknowledged. It is clear from the ToR for the MTR that such a review was anticipated, but the final documentation from the MTR does not address the need for revising indicators and targets.
One of the key SESP implementation strategies is, according to the Core Document, policy coherence and phasing to ensure smooth and effective implementation. The evaluation team asked for evidence of this in its interactions. The key example given to the team is the decision to let the CDC develop the curriculum before educational activities were initiated. This example is well in line with the aspirations of the Core Document, which states that the teacher education activities were supposed to be linked and timed to the revision of the curriculum and the provision of new instructional materials. In practice, however, the effectiveness of these strategies has been reduced by delays in fielding TA inputs to these processes. Moreover, feedback from the various implementing entities suggests that the entities are mostly concerned with their own mandates and responsibilities rather than with maximising collaboration between agencies.
Stakeholders point out that GoN has been relatively hesitant to take the lead in coordinating technical assistance. A more effective use of these TA inputs could possibly have been ensured if the inputs provided under the ESAT and the SESP consultancy, one of the major TA packages financed by the programme, had been more closely coordinated. The lack of effectiveness was further aggravated by the fact that the SESP consultancy was fielded with a delay of almost two years. This in turn implied that many of the tasks envisaged in the ToR were either irrelevant or left with inadequate time for meaningful implementation: the consultancy team was given a 10-month period to carry out a work programme originally designed for 30 months of implementation.
5.2 Financial and Physical Implementation
Interviews with MoE, DoE and the donors and a review of Aide Memoires and financial reports indicate that the development partners have generally complied with their agreements and have committed funds on time and according to budget. However, the development partners report that the large budget for TA is not likely to be fully utilised, despite the fact that some of the TA funds were diverted to construction purposes after the MTR.
Efficiency in the management of the programme can also be measured by the financial and physical implementation rates. SESP has generally lagged behind EFA in terms of budget utilisation, especially in the first years of the programme, as illustrated by Chart 5.1. Expenditure related to construction and consulting services, in particular, was behind budget in the beginning. The delays observed in the initial years of SESP mainly reflected delays in launching some of the most costly activities of the programme: late initiation of civil works (mainly school buildings) and delay in procurement of the secure printing press. Provision of TA has similarly not been on target in all cases; in particular, the SESP Consultancy was awarded with significant delay. Moreover, the conflict, which peaked in 2004-06, is also believed to have had a negative impact on implementation rates. The low implementation rates have most likely implied that some of the wider effects and impacts of the programme, such as increased schooling and improved student performance, will materialise later than originally anticipated.
Chart 5.1 – Expenditure against budget
Source: GoN, 2008d:81.
The financial implementation rate has picked up in recent years as Chart 5.1 illustrates. Budget utilisation reached 83.2 per cent in FY 2007/08 compared to 69.1 per cent in the preceding year. As further elaborated in the section below, the MTR played a key role in facilitating a more efficient fund flow. Moreover, the cessation of the insurgency in 2006 clearly had a positive impact on efficiency. In summary, programme implementation is believed to have been relatively efficient, especially in view of the fact that the programme was implemented during some of the most intense years of the conflict.
5.3 Financial Management
Financial management issues have also had a negative impact on physical implementation rates. Interaction with districts, schools and the Office of the Auditor General all suggest that fund flows from central level and all the way down to the individual school have been marked by delays. This is also confirmed by a 2008 Action Plan for Financial Management Improvement adopted by DoE (GoN, 2008i), which acknowledges delayed (first trimester) fund release, delayed submission of financial monitoring reports and implementation progress reports, late submission of audited financial statements from schools, poor performance in performing social audits and, more generally, capacity constraints at all levels of the system to undertake financial management in accordance with the established procedures.
The Office of the Auditor General (OAG) cites examples of incomplete amounts being released to districts and examples of DEOs being late in releasing funds to schools. The OAG further reports that DEOs may have been hesitant to release funds to schools due to a perception that money will be misused at school level. DEOs interviewed by the evaluation team further report that the reports required from schools on previous activity and spending are often submitted late. Without these reports, the DEO cannot release funds for the new financial year. This however does not seem to be clearly communicated and/or well understood by some of the interviewed School Management Committees.
Late preparation of reporting at central level has been another cause of fund release delay. At first, donor-specific reporting formats were imposed, placing substantial transaction costs on the DoE, according to the DoE Physical Section. The original donor-imposed format had 22 headings; a simpler GoN format with only six headings was subsequently agreed. In the beginning of the programme, reporting further had to pay particular attention to the exact share financed by Danida, ADB and GoN, as it had been agreed that the relative contribution of each partner would vary by the type of activity being financed. However, the 2006 MTR introduced a single common percentage across all loan categories, relieving the DoE of yet another administrative burden.
The start of the school year varies across the country and is not aligned with the mid-July start of the fiscal year, which adds further delays to programme implementation, including delays in disbursing scholarships.
According to the abovementioned action plan (GoN, 2008i), several initiatives have been launched to speed up fund release and improve financial management in general. These include negotiation of agreements on modalities with the National Planning Commission (NPC) and the Ministry of Finance and provision of audit guides together with one-day orientations for school auditors in the five development regions. While the action plan is laudable and the follow-up taken as a sign of commitment, the plan could be further strengthened by including clear indicators with targets to better allow for an assessment of progress, e.g. how much delays have been reduced and how many schools are subject to audits.
While the bulk of funds under SESP has been transferred through the normal GoN system as described above, ESAT has played a much-appreciated role in facilitating and making short-term TA available through a so-called direct funding facility managed by ESAT. As reported by CDC, it has been good for efficiency to have access to the ESAT-managed direct funding facility, which allows agencies to obtain funding outside the normal GoN budget. This, however, is not in line with the principles of the Paris Declaration on Aid Effectiveness, which prescribe the use of national financial management systems to the extent possible.
As pointed out by the DoE Programme and Budget Section, the DoE encountered challenges in terms of procurement. Most procurement was local and followed national rules, but major procurements exceeding USD 100,000 followed ADB rules and required no-objection from ADB Headquarters. The above-mentioned delay in procurement of the SESP consultancy, for example, was a result of the need for DoE to exchange the tender dossier 5-6 times with the ADB headquarters in Manila. For many officers in the DoE, this was their first time procuring a substantial service contract according to ADB guidelines, so it is not surprising that delays occurred. Moreover, given that the service contract involved several agencies (DoE, NCED, OCE and CDC), the preparation and agreement of the ToR took longer than expected.
5.4 Ownership
The SESP was largely drafted by external consultants. According to some DoE officials, the drafters of the programme document did not consult properly with Nepalese education officials and organisations. Similarly, a MoE official interviewed by the evaluation team pointed out that there were incompatibilities in the original SESP document between the programme activities and the corresponding budget. The reported lack of consultation and the incompatibilities are to some extent confirmed by the fact that several major changes were made to the programme at the MTR. For example, several activities had to be fine-tuned to better match the available financing.
Despite a difficult beginning, including what seems to be inadequate involvement of key stakeholders, the feedback from the interviews clearly suggests that the GoN has demonstrated increasing commitment to the SESP over the period of implementation. The overall commitment to the education sector is evident from Nepal’s commitment to the global EFA targets reflected in the Jomtien and Dakar conferences and the Millennium Development Goals. It is also notable that SESP has been implemented within existing institutions, with most of the funds being managed “on budget”.
All schools visited by the evaluation team expressed that the SESP investments responded to a strong interest and need at the local level. However, only a few of the schools interviewed by the evaluation team had prepared maintenance plans. Despite the absence of a plan and limited access to specific maintenance funds, all schools were convinced that they would be able to provide the required funds if necessary: some schools expected that a budget would be set aside by DoE/DEO for this, while some also expected to be able to contribute with minor, inexpensive maintenance works.
On balance, ownership of the programme is assessed to be high at central level. A similar conclusion can be made for the district/school level, especially if the schools prepare and implement maintenance plans for the various infrastructure investments provided by SESP.
5.5 Summary of the Assessment
The overall assessment is that DoE has done a satisfactory job of ensuring that the various outputs of SESP have been delivered despite delays in the beginning of the programme. Hence, efficiency is overall assessed to have been satisfactory. Management of TA has, however, not been satisfactory, as evidenced by the delayed fielding of the SESP consultancy. The ESAT facility, on the other hand, has been welcomed by the GoN and has been well institutionalised into the MoE. Financial management has largely been efficient, but there are indications that the fund flow mechanisms need to be improved, especially with respect to the DEO transfer to school level. To this end, it is positive that DoE has adopted an action plan to improve financial management performance. Overall efficiency has also been hampered by the requirement to use external procurement guidelines with which the relevant officials were not familiar. It is assessed that the annual review acted appropriately, although with some delay, to revise the programme in a way that allowed for more efficient implementation.
DoE’s style of management could arguably have been more strategic with a view to increasing effectiveness. Specific allocation decisions, particularly with respect to capacity development, could have benefited from being guided by a strategic framework rather than being the result of a series of bilateral negotiations. Likewise, it is somewhat surprising that no decision was made to revise the key performance targets for the programme, although several of these were achieved already at an early stage of the programme.
The programme has been improved along the way by aligning to national reporting requirements and removing other donor-specific requirements. This has brought SESP better in line with the principles of the Paris Declaration on Aid Effectiveness, which in turn bodes well for sustainability and demonstrates commitment from the development partners.
Finally, it is positive from a sustainability point of view that SESP has largely been implemented through existing, national structures. Accordingly, the lessons learned, good practices and experience generated from implementing SESP will be available to draw on for the design, implementation and fine-tuning of the SSR.
5.6 Specific Recommendations
The following recommendations are proposed for consideration by the GoN and its partners:
- Increase attention towards following up on review decisions.
As a minimum, Aide Memoires should specify who will be responsible for driving forward the various decisions adopted, along with a deadline. Decisions on recommendations should take into account the resources and capacity of the entity proposed to carry out the proposed activities.
- MoE should take ownership for management of TA in the interest of improved coordination and effectiveness.
The MoE should take ownership of the process, from initial needs identification through to actual procurement and management. The evaluation has learned that under SSR it is envisaged that the Foreign Aid Coordination Section (FACS) will act as the single access point for government entities in the educational sector to request TA, with FACS then liaising with the various donors who have pledged to make such funds available. This structure is a notable step in the right direction.
- The GoN should move towards a situation in which all of the TA funds are managed on-budget, including those funds currently managed by the direct funding facility.
To bring management of TA in line with the principles of the Paris Declaration, it will be necessary to do away with separate flows of TA funds. Such flows are foreseen to continue under SSR, but it is recommended that MoE, through FACS, gradually seek to engage its partners in a dialogue on how and when TA funds can be managed according to Government priorities and procedures.
- Define and consistently measure performance against specific indicators to track efficiency of fund flows all the way down through the system.
The GoN needs to identify and deal with potential “blockages” in the system more quickly and more precisely. As mentioned, the Action Plan on Financial Management Improvement signals the GoN’s commitment to sound financial management. By including specific indicators, targets and baseline data in such a plan, the GoN will be able to assess the progress made more precisely. One key indicator to include is the processing time to channel funds from central level to the accounts of the individual schools. It may furthermore add value to undertake detailed expenditure tracking surveys to identify the actual bottlenecks in the system, estimate the scope of the delays and improve on these.
This page forms part of the publication 'Joint Evaluation of the Secondary Education Support Programme' as chapter 7 of 14
Version 1.0. 17-05-2010
Publication may be found at the address http://www.netpublikationer.dk/um/10395/index.htm