Harmonization Around Results Reporting: A Synthesis Of Four Country Studies

Authors: White / R-García / Balasundaram
Positions: Senior Results Management Specialist at World Bank / Senior Specialist and Consultant to the World Bank / Consultant

This study was conducted for the Organization for Economic Cooperation and Development-Development Cooperation Directorate ("OECD/DAC") Joint Venture on Managing for Development Results.


Introduction

This article discusses "harmonization around results reporting" within the international context of improving aid effectiveness. It seeks to shed light on the inter-relationship between external reporting for donors and internal reporting for national accountability. Harmonization refers to increased coordination and streamlining of activities by different aid agencies based on the sharing of information in order to promote transparency and improve coordination; gradual simplification of procedures and requirements to reduce their burden on partner governments; and development of common arrangements for planning, managing, and delivering aid. The concept is often expanded, as in this article, to also include issues of alignment and ownership. Examples of such issues are: (1) the government taking the lead in coordinating donor efforts; (2) donors relying on country systems and procedures; and (3) development agencies delivering aid in accordance with partner country priorities.

Section I provides a brief description of the international context for harmonization around results reporting. Section II summarizes the evolution toward harmonization in four country cases: Uganda, Tanzania, Mozambique, and Madagascar. Section III proposes three critical factors that shed light on how to deepen efforts for harmonization around results reporting. Each country context is unique and has unique leverage points for action around harmonization. Therefore, this article does not attempt to provide operational steps for moving toward harmonization. Instead, it captures experiences and draws lessons that might be applied to these and other countries; namely, that there is progress, but it is slow and not easy. The three critical factors found to deepen efforts for harmonization around results reporting are: (1) developing a reliable basis for reporting; (2) fostering country ownership of results reporting processes, giving primacy to country-defined results over externally driven results; and (3) providing programmatic, rather than project-only, coordinated support for national systems.

Global Context For Harmonization

At the Monterrey International Conference on Financing for Development1 in 2002, as donors pledged significant increases in aid levels, they recognized the need for development agencies and partner countries to strengthen their focus on results and development effectiveness. They also acknowledged that achieving results requires better use of resources, and that better aid, not just more aid, is part of donor responsibility. Since then, the need to better manage for results-to use information to improve decision-making and steer country-led development processes toward clearly defined goals-has become a central element of the global development agenda. This commitment to better manage for results was renewed and increased in the 2004 Marrakech International Roundtable on Managing for Development Results,2 the 2005 Paris High-Level Forum on Harmonization, Alignment, and Results,3 and the 2005 G8 Gleneagles Summit.4

The 2004 Global Monitoring Report5 shows that many developing countries have made progress in accelerating growth and reforming policies,6 for instance in the areas of budget, financial management, and corruption in the public service. Yet, while progress has been made, enormous development challenges still face low-income countries, especially in Africa. This is coupled with new commitments to the doubling of aid for Africa, recently made at the Gleneagles Summit, and estimates that aid to all developing countries will increase by around $50 billion per year by 2010. Much of this will go to low-income countries, through the framework of their national strategies (second generation poverty reduction strategies). There will be increased expectations that these funds will be effectively used in countries and will be driven by participatory and transparent development processes. The resounding theme is that country-owned and -led development is critical to achieving and sustaining results.

Evolution Of Harmonization Around Results Reporting

Efforts to coordinate monitoring and reporting in programs and sector wide approaches ("SWAPs") played a significant role in the move toward harmonization around results reporting. Traditionally, reporting was concerned mainly with inputs (finance) and outputs according to different donors' formats and reporting needs. With the introduction of SWAPs, donors were increasingly confronted with more budget-wide financing than project-specific financing. Donors were also faced with the need to better coordinate reporting and use country systems, given the situation that several donors support the same broad sectors or programs in the public sector. This led to reporting requirements at higher, more centralized levels as opposed to reporting in a project-by-project manner. It also led to an enhanced appreciation of the need to strengthen country monitoring systems.

Numerous other strategies to harmonize results reporting followed SWAPs. The preparation of national development plans ("NDPs") and/or Poverty Reduction Strategies ("PRS") provided both an opportunity and a vehicle to expand the dialogue on results reporting to additional sectors and donors. It offered a common basis for donors, government, and civil society to identify a strategic approach to poverty reduction, an operational framework to achieve national goals, and a mechanism to develop cross-sector linkages. Conceptually, the strategy and programs in the PRS could easily translate into a monitoring system for tracking progress. This would enable donors to select indicators associated with policy areas and use these for external reporting. While the principles of the PRS promoted country ownership, participation, and a focus on results, the demand for results reporting was externally driven. During implementation, efforts to harmonize around results reporting had mixed results. The PRS process underscored the need for good reporting and for strengthening countries' capacity to report on results.

Country Context For Results Reporting

Many countries lacked monitoring and evaluation systems for reporting, or the systems that were in place were fragmented and overlapping. There was inadequate institutional and organizational capacity to establish coordinated monitoring systems,7 even though for several years prior, donors had supported project monitoring and evaluation. Since few countries had taken a programmatic perspective in developing monitoring and evaluation systems, there was little analytical work on how to coordinate donor support for strengthening country systems in a sustainable way.

Many countries did not yet recognize the value of using results information in policy making processes or in program management.8 Instead, they focused on tracking expenditures, monitoring high order socio-economic indicators, and reporting to donors. Countries concentrated less on how results information could be valuable for effective and efficient public sector management. There was little experience to draw on for using results information as part of public sector management and for understanding the different requirements for information and organization in results based approaches.

The introduction of PRS acted as a catalyst for a more coordinated approach to poverty monitoring. The increased demand for reporting on results helped put governments and donors on a forward track in addressing data constraints. This translated into support for poverty monitoring systems, including efforts to increase the participation of civil society and improve access to information. Much of donor support focused on the supply and coordination of data, such as conducting surveys or supporting the census. Donor support otherwise tended to focus on each donor's own reporting requirements and priorities.9 Less attention was paid to how results information would be used for government purposes, such as policy making, budgeting, or management.

Advancement Towards A More Holistic Understanding

There was little apparent recognition that harmonization around results reporting could facilitate or impede progress in establishing sustainable country systems. Later, analytical work began to evolve that assessed the use of information and the broader institutional setting for monitoring. This included increased attention to expenditure management, the flow of information, and analytical/evaluation skills. This work helped deepen the debate and understanding of the complexities of developing a poverty monitoring system and the importance of balancing the various roles that the monitoring system was expected to serve, such as external accountability to donors, a country tool for policy decisions, and in-country accountability to citizens.

Over the same period, the international community intensified efforts for broadly improving aid effectiveness through better aid coordination and improved harmonization and alignment across a range of activities. At the country level, consultative group meetings and other mechanisms for government-donor coordination, such as sector working groups, offered a vehicle for dialogue across specific subsets of donors. At the global level, critical issues of aid effectiveness were being discussed through high-level roundtables and the Organization for Economic Cooperation and Development-Development Cooperation Directorate ("OECD/DAC") collaborative actions. Important topics included improving the predictability and reducing the volatility of aid, rethinking conditionality, and deepening the understanding of country ownership. The international community also recognized the important role of managing for results as a key tool for countries. These global messages and debates were increasingly supported at the country level, by both countries and in-country donors, where the broad principles discussed in the global debates were made operational and provided a useful basis for discussion of harmonization around results reporting.

Country Experiences In Harmonization: The Cases Of Madagascar, Mozambique, Tanzania, And Uganda

All of the contextual factors outlined above played an important role in progress toward results reporting in the four country cases. The evolution of each country was unique and helped to identify a number of elements that served to promote and support harmonization around results reporting.

The experiences in these four countries demonstrate that the development community is evolving in its understanding of the country and international context that comes together in harmonization around results reporting. The four cases show that externally driven results reporting systems present challenges to the country-led model of development, and that a country-led results reporting system can satisfy both the internal reporting needs of countries and the external reporting needs of donors. The cases also show that the donor community and countries are together putting in place systems that serve country and donor needs-albeit sometimes through trial and error. The review of the four country cases shows that harmonization around results reporting is facilitated by robust technical elements such as:

- Strategies and programs designed to enable continuous and systematic assessment against results-based objectives.

- Supporting monitoring systems with well defined indicators that cover monitoring of resource use (financial and human), tangible outputs (such as efficient extension of services to farmers), intermediate outcomes (such as improved use of inputs and new farming techniques), outcomes (improved yields and reduced losses from weather shocks), impacts (improved agricultural productivity), and links to broader national objectives (such as increased agricultural exports).

- Well defined and reliable data that are feasible to collect and simple to monitor.

- Analytical capacity to translate routine monitoring information into evidence to support decisions, including cost-benefit considerations.

Experience is also showing that technical solutions alone are not sufficient. Equally important are the following:

- Country ownership of the programs, as well as the existence of incentives to use information on results in expenditure decisions (with links to the budget or medium-term expenditure frameworks). This includes balancing country accountability with external accountability.

- Institutional arrangements for reinforcing how information flows are used.

- Understanding of the political nature of results reporting.

Madagascar

The highly participatory PRS process, harmonization around sector wide approaches, the drive for enhanced country monitoring systems, and achievement of sustainable results on the ground from the National Environmental Action Plan ("NEAP") all contributed to an evolution and progress in the harmonization agenda around results in Madagascar. Harmonization around results reporting in this country commenced with the implementation and coordination efforts of a group of donors in the context of environment and biodiversity in the early 1990s. Madagascar's unique background as an island nation, home to many of the world's endemic and endangered species, and highly vulnerable to shocks, provided many lessons on harmonization in the context of the environment and its broader application to issues of results and sustainability for PRSs as a whole. Lessons on harmonization around results reporting were derived from the environment sector and its implementation, including in the context of its PRS program and recent efforts to strengthen the Poverty Monitoring System ("PMS").

Madagascar was the first country in Africa to elaborate a NEAP. This occurred six years prior to the signing of the Convention on Biological Diversity. The first phase of the NEAP was initiated in 1991, in the face of a limited conservation baseline, with the support of a broad coalition of donors. These included bilateral donors, namely Germany, France, Norway, Switzerland, and the United States; multilateral institutions, such as the United Nations Development Programme ("UNDP") and the World Bank; and non-governmental organizations, namely Conservation International, World Wildlife Fund, and Wildlife Conservation Society.10 A Multi-Donor Secretariat, which was co-financed by USAID, France, and the World Bank, was set up during the second phase of the NEAP to carry out coordination and enhance implementation of the program. Multi-donor supervision missions have been conducted twice a year since 1996.11 Recently, Japan and the Global Environment Facility ("GEF") have also joined this group.

Donors began to harmonize budget support assistance in 2003. Several activities in the mid-1990s laid the groundwork for a coherent PMS in Madagascar.12 Experience with harmonization around results reporting, and with monitoring and evaluation, in the environment sector provided useful lessons for donors in shaping results-oriented program formulation, planning, monitoring of results, and evaluation over time. It was not clear, however, that donors and the Government had taken these lessons into account in the design of current programs.

Mozambique

The evolution to harmonization around results reporting started with improved links between sector strategies and national results. Mozambique has a long tradition of planning in the public sector. This planning is centrally-driven and relies on a sector-based approach. Historically, sector policies have been aligned with donor priorities. However, over time donors and the government observed a weak link between these sector policies and the national budgeting and priority setting process. In 2001, the introduction of the Plano de Acção de Redução da Pobreza Absoluta ("PARPA"),13 Mozambique's Poverty Reduction Strategy Paper ("PRSP"), helped improve alignment of sector strategies to national objectives. It also brought greater coherence across sectors and improved policy coherence of sector strategies with the government's overall policy thrust.14

The PARPA, coupled with the medium-term expenditure framework ("MTEF") process, which was first formulated in 1998, included detailed plans, timelines, and indicators. This formed the basis for developing monitoring systems that built on already existing mechanisms. Donors supported the PARPA through budget support, sector wide approaches, and project financing. While some form of budget support had been delivered through the 1980s and 1990s,15 it was linking budget support to the PARPA that led to increased coordination among donors and initiatives in support of harmonization and alignment.16 In 2002, preparation of a detailed matrix of PARPA activities by sector revealed a need to make information about PARPA priorities more systematic and to monitor a meaningful number of indicators.17

The Government led a process to define a performance assessment framework, requesting all PARPA sector ministries and those responsible for cross-cutting reforms to identify priorities for the coming three years. This resulted in a 56-page Matrix, called the long Performance Assessment Framework ("PAF"), which was then reduced to a short PAF consisting of two pages of key priorities.

As part of reporting, there was a series of annual exercises that led to the issuance of reports to Parliament on progress in PARPA implementation. The government integrated the required annual progress review ("APR") of the PARPA into its own regular reviews of implementation. This dual role for the Balanço do PES - reporting to Parliament and to donors - simplifies national reporting requirements and "provides an adequate evaluation of the progress made in achieving the PARPA goals." Unfortunately, a 2004 independent review noted that only a few donors comply with government reporting requirements, and donors have failed in attempts to reduce the government's administrative burden. While both donors and government have demonstrated a commitment individually and jointly, the future will likely require even greater efforts if progress on harmonization in results reporting is to continue.

Tanzania

Efforts for harmonization around results reporting in Tanzania started at a low point in government-donor relations in the early 1990s. At this time there was a large donor community, high aid dependency,18 high transaction costs in dealing with the range of uncoordinated priorities, and the associated procedural and reporting requirements of multiple donors. In 1995, an independent evaluation of donor/government relationships resulted in a commitment to development cooperation and establishment of a mutual accountability mechanism.19 Tanzania and its development partners were also using sector wide approaches to create opportunities for coordinated work in support of sector specific results. However, weak monitoring systems undermined the utility of the sector wide approaches for harmonization around results reporting.

The donor community had a good grasp of systems and technical issues and wanted to deal with the myriad of systems in place. At the same time, the government was aware that strengthening public financial management systems and introducing effective poverty monitoring was a precondition for persuading donors to allocate more development assistance through budget support.20 In 1999, the introduction of the PRS and focus on the PMS provided an opportunity for Tanzania to advance harmonization of results reporting. The PRS included targets and indicators intended to inform programming decisions, and comprehensive institutional arrangements for assembling, analyzing, and disseminating poverty data. The PRS annual progress report provided an opportunity for donors to use government reporting for common results reporting.

In practice, the annual reviews were somewhat insubstantial, resulting in donors working on parallel policy and reporting matrices. Budget support donors developed a performance assessment framework to which budget support instruments were linked, helping to harmonize results reporting for donors. However, there was a disconnect between the donor-driven PAF and the government's PRS, requiring further harmonization of the PRS action plan and the PAF over a three-year period. In 2002, the government and donors created the Tanzanian Assistance Strategy, which included guidance on harmonization around results reporting, such as integration of reporting and accountability systems. It also attempted to better align donor support with the use of country systems. There are on-going efforts to improve the basis for results reporting. However, there remains a lack of incentive to produce good quality data and to use results information in policy and budget decisions. Attempts to link the PMS with public sector management have started.

Finally, in the current budget cycle, the procedure for collecting sector inputs is being modified in a way that puts pressure on all sectors to justify their bids in terms of the relevant cluster strategies in the PRS. Sector policy makers thus have an increased incentive to develop outcome-oriented rationales for what they do with their allocations from public resources and to make use of data on results.21 Tanzania has progressed considerably in moving to a results-based planning, monitoring, and reporting system.

Uganda

The move toward poverty reduction and reforms in 1986 and international pressures for demonstrating the effectiveness of aid set the conditions for harmonization around results reporting in Uganda.22 Uganda was able to qualify as the first heavily indebted poor country ("HIPC") beneficiary because, as early as 1997, it had put in place a strong domestically owned poverty reduction and development policy framework, locally known as the poverty eradication action plan ("PEAP"), later transformed into the PRS.

Apart from a few circumstances in which external development assistance is reported to have been managed below expectations, Uganda's case presents a success story of aid effectiveness. This is evidenced by the gains made in recent years, especially in the social (education and health) sectors, where aid resources have had a remarkable impact on growth and social development indicators. Most of the achievements in these and other sectors are largely attributable to an improved and enabling policy environment reflected in the budget process and improvements in public finance management, and a series of anti-corruption initiatives accompanied by a high degree of transparency and accountability in the use of public resources.

The evolution of harmonization in Uganda is characterized by strong country ownership and leadership in undertaking essential reforms and improving management of the aid portfolio to enhance effectiveness. Uganda was one of the first countries in the sub-Saharan African region to have fully developed its own PEAP, as early as 1997. This has continued to play a central role in fostering country ownership of the development policy process, especially towards ensuring sustainability of the achievements. PEAP objectives have been increasingly realized through strong ownership of the development process as led by the Ministry of Finance, Planning and Economic Development. Additionally, the PEAP was coupled with the development of a holistic national strategy against hard budget constraints and anchored in a medium-term expenditure framework.

These achievements encouraged development partners to shift their approaches and practices toward a country-led partnership model. Movement toward budget support and sector wide approaches helped reinforce country ownership and donor partnerships. There were efforts by the government to enhance the results orientation of its strategy, monitoring, and processes.23 Efforts to demonstrate results were institutionalized through a rational, outcome-oriented budget process focusing on domestic accountability as well as external accountability through mechanisms such as budget support instruments, e.g., the World Bank Poverty Reduction Support Credit. A hard budget constraint and the MTEF have been the key financing channels for the PEAP. Uganda's socio-economic performance highlights the fact that achieving results requires better use of resources. All stakeholders in Uganda's development process have for the last four years actively participated in the national budget formulation process with a view to ensuring that public resources are channeled to those areas where they will have a quick and sustained impact on poverty while contributing to sustainable economic growth.

Over time, the quality of the monitoring and evaluation systems supporting results reporting improved, and some donors harmonized around results reporting, especially in budget support, sector programs in health and education, and national multi-sector programs such as HIV/AIDS. There are also increased efforts to develop more sector wide approaches in addition to health and education (i.e., water, energy, agriculture, public service reform, and procurement, among others). However, weaknesses in country monitoring and evaluation systems remain an obstacle to full donor reliance on Uganda's monitoring system for results reporting. There has been recent progress in developing Uganda's monitoring and evaluation system through the National Integrated Monitoring and Evaluation Strategy ("NIMES"). Challenges remain in how to make best use of information on implementation of the PEAP in the domestic policy making processes.

Key Factors In Turning Harmonization Concepts Into Practice

As the case-studies demonstrate, much work has been undertaken and countries and donors are moving forward in tackling technical and non-technical dimensions of harmonization around results reporting, but progress is slow and the process challenging. Three critical factors have been identified that shed light on how to deepen efforts for harmonization around results reporting: (1) developing a reliable basis for results reporting; (2) fostering country ownership of results reporting; and (3) ensuring programmatic support for capacity building in monitoring and evaluation.

Factor One: Developing A Reliable Basis For Results Reporting

Donors must report on country results of aid financing to their domestic constituencies. The responsibility of donors is to provide information on fiduciary accountability (money spent and inputs/outputs) and on development effectiveness from the aid provided (what are the benefits of the money spent). In order for donors to use information from country systems for external accountability, the information the systems provide must be reliable. In most countries, the information needed for results reporting is not captured in country monitoring systems or is patchy at best.

The absence of a results orientation often translates into donors - rather than countries - identifying input-outcome links and associated indicators for results reporting, or reverting to reporting on activities or disbursements. A recent review concluded that most indicators in monitoring systems for the PRS are budgetary/expenditure (input) indicators and survey-based measures of well-being, such as impact indicators for poverty reduction.24 Less attention has been paid to articulating important intermediate steps that are responsive to shorter term measurement.25 Thus, while there are many issues with the technical quality of data, a key bottleneck is the weak results orientation in strategies and supporting processes.26 There are two concrete actions that the donor community and countries can take to strengthen this often overlooked dimension of harmonization around results reporting: (1) develop stronger linkages between policy actions and results; and (2) tailor results reporting to government policy processes.

In a results-based approach, there would be well established links between policy choices and intended medium term outcomes, and how these would impact achievement of longer term goals, such as the Millennium Development Goals ("MDGs"). These linkages should be based on sound analysis and identification of key constraints and priorities (evidence-based). For harmonization around results reporting to be useful, indicators would be responsive to program implementation or short term policy actions, not only longer-term high level goals. What does this mean in practice? Take for example a common objective of countries and donors, "to make progress towards the MDG of reducing income poverty." To make progress in that direction, countries will need to achieve the associated goal of increased off-farm and on-farm income through medium term outcomes, such as increased agricultural productivity. Knowing if the country is on track to increased productivity requires short term measures to help donors and governments know whether the interventions, such as new farming techniques, improved land tenure, and use of micro-finance, are working.
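By way of illustration only - the article does not prescribe any particular format or tooling - the results chain just described could be sketched as a simple data structure that links short-term measures for interventions to the medium-term outcome and the longer-term goal. The indicator names below are hypothetical, chosen to mirror the agricultural example above:

    # Illustrative sketch only: a hypothetical representation of the results chain
    # described in the text (interventions -> intermediate outcome -> outcome -> goal).
    # Indicator names are invented for illustration; they are not drawn from any
    # country's actual monitoring matrix.

    results_chain = {
        "goal": "Reduce income poverty (MDG 1)",
        "outcome": {
            "description": "Increased off-farm and on-farm income",
            "indicator": "household income from agricultural and non-agricultural sources",
        },
        "intermediate_outcome": {
            "description": "Increased agricultural productivity",
            "indicator": "crop yield per hectare",
        },
        "interventions": [
            {"name": "New farming techniques", "short_term_indicator": "share of farmers adopting improved practices"},
            {"name": "Improved land tenure", "short_term_indicator": "number of land titles issued"},
            {"name": "Use of micro-finance", "short_term_indicator": "value of loans disbursed to smallholders"},
        ],
    }

    def report(chain):
        """Print the chain from short-term measures up to the long-term goal."""
        for item in chain["interventions"]:
            print(f"Intervention: {item['name']} (tracked by: {item['short_term_indicator']})")
        print(f"Intermediate outcome: {chain['intermediate_outcome']['description']}")
        print(f"Outcome: {chain['outcome']['description']}")
        print(f"Goal: {chain['goal']}")

    report(results_chain)

Tracking whether the short-term indicators are on course is what tells donors and governments if the interventions are working, well before the higher-level outcome and goal indicators can be expected to move.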

The Uganda PEAP provides an example of the difficulties in developing a monitoring matrix that is useful for results reporting and shows how donors and the government worked together to address weaknesses in the basis for results reporting. At design, most sectors did not articulate intended chains of causation. This resulted in monitoring matrices that were heavily skewed toward final outcomes and impacts, missing specification of intermediate results from policy actions. Higher-level indicators of progress towards the PEAP targets could only be measured at relatively long intervals. Even when data was available, it was difficult to discern the causal contribution of the PEAP to the changes in these indicators or to draw policy lessons from them. Some donors indicated that the PEAP was insufficiently detailed to influence allocation decisions and was unable to relate the expenditure on investment (policy and program decisions) to outputs (observable in the short-term) and intermediate outcomes. This issue was addressed in the latest PEAP revision, and work was initiated on the development of a PEAP results and policy matrix. The PEAP matrix specifies the key developmental results the PEAP is trying to bring about and the annual policy actions that are expected to contribute to these results. Indicators and targets are set both at the level of broad developmental outcomes and at the level of policy actions. The new matrix should ease both the management of PEAP implementation and the monitoring of progress with PEAP implementation. It will also be a useful tool to inform dialogue between Government and a range of domestic and international stakeholders.27

In several other countries, inadequately defined linkages between inputs and expected outcomes in the PRS resulted in donors developing performance assessment frameworks primarily for donor reporting purposes. These donors were then challenged with filling the gaps in information needs. Conceptually, the indicators in the PAF represented a subset of indicators from the PRS. In practice, other indicators were often negotiated by governments and donors providing broad budget support.28 In Mozambique and Tanzania, useful processes for coming to agreement on a set of results indicators and alignment of relevant donors to the budgeting and reporting process of government included discussions on the Performance Assessment Framework ("PAF") or donor specific results matrices, such as the World Bank Poverty Reduction Support Credit ("PRSC"). In Uganda, the 2004 PEAP introduced a Policy Matrix and a Results and Monitoring Matrix that started to bring the PRSC and other monitoring and evaluation ("M&E") arrangements in line with the PEAP policy matrix. The PEAP policy matrix will be the only instrument for assessing progress towards attainment of PEAP objectives. This should result in future iterations of the PEAP and the supporting PMS encompassing a broader range of results information, thus adding to their usefulness to donors for results reporting.29 These efforts evolved over the initial years of PRS implementation and have been positive steps toward harmonization around results reporting on budget support.

There is a danger that donor negotiations can result in a proliferation of indicators, as donors seek to include their focal areas or vertical programs, with relevant indicators, in the matrix. While there are obvious implications of this in terms of data collection, monitoring, and reporting, the recent World Bank Conditionality Review 200530 pointed out that this practice could lead to an increase in the number of conditions and "the quality and relevance of the substance could suffer."31 Additionally, this practice, if not done properly, could undermine country ownership, duplicate reporting mechanisms, and miss opportunities to strengthen country capacity to use meaningful indicators for expenditure management and policy decisions. While the details of the debt proposal remain to be worked out, one important result of this initiative will be to shift more resources towards unrestricted budget support rather than to specific projects or programs. This underscores the imperative for establishing processes that are supportive of strengthening country systems.

A key consideration for results reporting is how to make results reporting a subset of a country monitoring and reporting system that feeds into the policy process. Experience has shown that: (1) monitoring systems only function where the participants see them as useful and legitimate; and (2) where the monitoring arrangements emerge out of a common commitment to solving practical problems, they have a much greater chance of success. Without tailoring results reporting to the policy process, the risk is that monitoring results becomes more of a requirement than a useful tool, often precipitating a decline in the reliability of the information produced by the system. For reliable results reporting, this implies that donors need to foster increased ownership and use of monitoring systems by government by building on country processes.

The PRS established a process for review that provided an opportunity to strengthen the use of country monitoring and reporting systems. Each year, participating governments were expected to produce a review of progress in implementation, based on evidence, known as the APR. These reviews were originally conceived with donor reporting requirements in mind, but many APRs have not provided sufficiently detailed information for donors, focusing primarily on impact and outcome level data. As a result, the APR has not been systematically used by donors (see Box 2). Since the APRs were often viewed as an external reporting requirement, they also were not systematically used by governments. A 2004 evaluation of the PRS noted that "governments in most countries are monitoring results as a requirement, and results are not being used to adjust strategies or to enhance accountability for performance."32 Many of the Joint Staff Assessments by the World Bank and the International Monetary Fund ("IMF") stated that a major constraint for PRS implementation was weak integration of monitoring systems into the policy making and accountability process. Even in the more successful cases of Tanzania and Uganda, there was less focus on strengthening the links between deriving information for donors and making this information useful for the Government's own reporting and policy processes.

There are also examples of harmonization around results reporting in programs and projects. In many cases, harmonization around results reporting has evolved over time in a sector or theme. Madagascar's NEAP provides a good example of how donors and government have evolved to harmonization around results reporting. As NEAP moves to its third iteration, there is a strong relationship between government and donors, based on an agreed set of expected results and indicators. A joint steering committee: (1) ensures that government and donor investments are defined and implemented in a manner compatible with the results framework and agreed upon indicators; (2) monitors progress towards agreed upon results; and (3) provides strategic orientation and guidance for the overall program implementation and coordination with other sectoral and development programs. Participating donors (such as the World Bank, United Nations Development Programme, and Global Environment Facility; French, German, Japanese, Swiss, and U.S. bilateral programs; Conservation International, World Wildlife Fund, and Wildlife Conservation Society) are then able to use the M&E system for their own results reporting. The M&E system with common indicators enables a more direct linkage between financial sources and results on the ground, while avoiding the need for donor coordination at the activity and input level. The process is thus conducive to both harmonization around results reporting and the reinforcement of government systems.33

Factor Two: Fostering Country Ownership Of Results Reporting

Country ownership is essential to achieve development outcomes and to foster continuity in priority setting across political cycles. The current context of increased aid flows and the associated need for external results reporting presents a challenge to maintaining country ownership. The MDGs, the G-8 initiative, and vertical programs to achieve specific outcomes (e.g. infrastructure, health, education)34 will likely increase pressure on donors to report on progress toward international goals to external taxpayers. This may also encourage a greater funneling of resources through vertical, sector-specific channels rather than through country systems (see Box 3). Experience shows that, if ignored, these pressures can translate into results reporting systems that track data for an external audience, but lack legitimacy and ownership by those who are setting policies, implementing programs, or monitoring results. A recent development committee paper on aid effectiveness and aid financing explicitly notes that "it is important to not undermine country ownership of the development agenda....the use of earmarked funds can cause distortions at the country level in terms of resource allocation, pressure on implementing capacity, increased transaction costs, and misalignment with country-owned mechanisms such as PRSPs."35 Uganda provides a positive example where external assistance is aligned to the national goals and priorities, as reflected in the PEAP, to ensure aid effectiveness. The government of Uganda has shown consistent leadership in its mode of receiving aid and identification of areas where this aid will be well managed to make significant impact on poverty and economic growth.

When the development objectives of the country and donors overlap, much progress can be made in reporting on results, and more importantly, achieving them. However, when these objectives are not congruent, this pressure can lead to results reporting systems with a number of indicators that are not achievable or reflective of country priorities. When these become a condition for disbursement, especially in budget support, there is a potential for decreased predictability and increased volatility, unfocused and perhaps less substantive programs, and erosion of the usefulness of the indicators.

At the agency level, the pressure for measuring progress toward global goals can result in a disconnect between the reporting requirements of donor headquarters and the principles of country ownership and partnership at the country level. This is especially true for bilateral agencies, where pressure from their parliaments or other politicians can result in a reporting system that is focused on attribution to agency programs/projects and, in some cases, creates incentives that work against harmonization or more programmatic approaches. Due to the governance structure of the multilaterals, pressure from their Executive Boards may work in a similar manner. At the same time, many donor agency staff members are being encouraged to manage for results at the operational level; to use and strengthen country systems and draw on those systems for results reporting. This can create a disconnect within the organization. Corporate reporting therefore needs to strike a careful balance between providing a common framework for systematically reporting results and ensuring flexibility in how results are reported in any given context (see Box 4).36 Often this depends on how well the country office staff has received the corporate messages and applied them innovatively in a country context.

Ultimately, donors and country partners are looking at the same results; thus, results reporting systems can: (1) supply donors with reporting, for their own control systems, that is fully aligned with country monitoring and reporting mechanisms; and (2) supply donors' Parliaments and the general public with knowledge of the national strategy and of how support from donor countries is yielding results. The challenge is how to translate the format and the presentation of results to serve the needs of the country, donor agencies, and Parliaments/external stakeholders. This is an area to be further explored.

Less often discussed is how the domestic political aspects of reporting results can influence country ownership and, in turn, the quality of the underlying systems on which results reporting is based. It should be recognized that what donors are promoting in partner countries in terms of a comprehensive results framework is something few Organization for Economic Cooperation and Development members practice. In addition, the political acceptability of a results framework with a longer term perspective is correlated to the stage of development of the political system. It is necessary to understand the political context and how efforts to achieve long term goals (such as universal primary education) can yield, or impede, short term political gains. Policy making is likely to have a perspective that emphasizes short term benefits. Thus, supporting governments in developing a sufficient basis for results reporting must respond to these political dimensions and build political ownership. Above all, results approaches must illustrate their usefulness for achieving advances that are politically valuable in the short term, e.g., an electoral cycle. Otherwise, there are risks that politicians will develop alternative efforts that are not consistent with the medium term policy outcomes and, thus, undermine implementation of a strategy. There are several studies that highlight the importance of engagement of Parliament and the cabinet in implementation of the MTEF.37 Additional reviews have pointed out that sustainable monitoring and evaluation systems require that more attention be paid to the political dimensions and use of information in the policy process, rather than just technical considerations.38 Yet, according to emerging evidence in some countries, there is a rapidly increasing number of policy measures aimed at short-term political gain, with weak, sometimes contradictory, links to strategy objectives.39 Therefore, innovative solutions at the country level are needed to inform how to develop results reporting systems that are responsive to political needs.

A key driver for harmonization around results reporting has been earlier efforts to improve aid coordination, increase the relevance of donor programs to country owned goals, and reach agreement on the use of common processes and procedures. Strong government leadership has been essential in engaging external partners in a continuous and successful dialogue focused on making development assistance more effective. For instance, clarity on government priorities, expected results, and the trade-offs made by governments can facilitate donor acceptance of less-than-total agreement on all strategy dimensions, while still aligning with country priorities. The 2005 Comprehensive Development Framework ("CDF") evaluation noted that this strong government leadership resulted in "development assistance agencies [being] more likely to align their support with country priorities, harmonize their working methods with the country's systems, and avoid supporting overlapping, competing, or non-priority efforts."40

The translation of international commitments, such as the Paris Declaration, from agency headquarters to field staff is a prerequisite for fostering a conducive environment for country results reporting that respects country ownership and priorities. A recent DAC survey noted that donors are basing their assistance on country priorities, particularly where there is ownership on the part of the country, government capacity to lead the donors, and commitment by donors to work differently. This type of alignment to government priorities is a first step in establishing a similar results reporting system. How this is done in practice varies depending on country circumstances. In Mozambique, the World Bank country team responsible for piloting a results based country assistance strategy ("CAS") used the opportunity of a new corporate initiative - piloting the results based CAS as part of the results agenda - to discuss with the Government and other donors how to harmonize around a common set of indicators at the strategy level. This resulted in a series of discussions among donors and government on the content of the strategy results framework and alignment of this to the Government's PARPA. It provided the structure for in-depth examination of goals and expectations from all sides and a discussion of how to monitor and measure those expectations. In many instances, these meetings represented the first time that sector representatives had held technical discussions with other ministries or the Ministry of Finance. During this process the World Bank and the Government's team engaged in a process of prioritization and selection and settled for trade-offs that were acceptable to everyone. The team was able to achieve 70 percent alignment of indicators by the time of presentation of the strategy to the World Bank Board, while committing to increase this alignment during implementation. At the same time, the team met guidelines of the corporate reporting system being piloted. This corporate reporting system allowed flexibility in identification of the targets and indicators, while maintaining technical rigor in the method applied to evaluation.

Factor Three: Programmatic Support For Capacity Building In Monitoring And Evaluation

As countries move to a stronger results orientation in their strategy development and planning processes - supported by the donor community - their capacity to do so must grow substantially. On the donor side, the evolution within agencies to a stronger results orientation in their assistance strategies and increasing reliance on country monitoring systems further underlines the importance of coordination for capacity building. This is coupled with the probability of budget support playing a much larger role in aid disbursements, underscoring the need to strengthen public sector management mechanisms, including project analysis, budgeting, reporting, and M&E. Developing monitoring and evaluation capacity is a long term process, and making incremental progress sustainable requires ownership by government.

Monitoring systems by themselves may not contribute very much to the enhancement of development effectiveness unless procedures are in place for the results from monitoring to feed into policy making and decision-making processes of governments and donors. Institutionalization of the evaluation function is equally important for harmonization around results, as much in-country evaluation is carried out in a somewhat ad hoc manner. The strategies used to introduce results-based management have varied across countries; however, there are similar elements that contribute to a successful shift to a results-based culture, and well-established strategies to move the results agenda forward. These include:

- A clear mandate for deepening the results approach within the governance system. This may include the presence of strong leadership, usually through a strong champion(s) at the most senior level of government. It may also be driven by economic pressures or other incentives for change (often, a concerned citizenry or the need to reduce the cost of burdensome civil service payrolls).

- Clear links to budget and other resource allocation decisions. This implies greater interconnectivity between government institutions and more transparent resource management systems.

- A results oriented culture and supporting organizational structures. The culture within countries may not value a focus on results. Agencies may lack sufficient administrative and organizational structures to support using results-based information for planning, management, and resource allocation decisions.

- Involvement of civil society as an important partner with government.

- Pockets of innovation that can serve as beginning practices or pilot programs.

- The capacity to define a national strategy aligned to sector, regional, and local planning is often weak. The move to an increasing role for local governments in service delivery necessitates better linkages to planning and management at lower levels of government - where capacity may be weak.

- The ability to design and maintain supporting statistical systems is weak, and there is not an adequate results-based workforce to develop and support information systems for sustained use. Often, government officials do not have the training or legal frameworks for modern data management to support a results-based management system. In many countries development data are collected by different institutions with little coordination on time periods and statistical methods, thus undermining the reliability of results reporting.

There are three actions that the donor community and countries can take to foster country ownership while meeting the need for external reporting. First, the donor community and the country can undertake a joint assessment of the monitoring and evaluation capacities that are essential for results reporting. Capacity building should be meaningful in the country context and include: the identification of relevant and viable indicators; the capacity to organize timely, efficient, "lean" data collection mechanisms; the capacity to assess in a meaningful way what actual observations on those indicators would tell; the capacity to formulate meaningful policy advice on the basis of observed trends; and the capacity to formulate evaluation needs. Second, donors can operationalize joint work and mutual accountability by scaling up M&E systems from project to sector to country level; for example, by supporting the integration of project-level M&E systems into line ministry structures. Third, the country and its development partners can sign a memorandum of understanding on how support for M&E, whether project, program, or institutional, will strengthen country systems in a sustainable manner.

Conclusion

The review of the processes and current environment around harmonization of results reporting reveals that the impetus for building harmonized systems around results reporting is not limited to defining indicators and agreeing on measurement. Instead, harmonization around results reporting is part of a broader political and economic context, both in country and in the international community, that points to the necessity of balancing external reporting needs with domestic accountability and of positioning results reporting within the country's development agenda, systems, and capacity to deliver evidence of results. The four cases (Uganda, Tanzania, Mozambique, and Madagascar) and an in-depth literature and documentation review led to the identification of factors and actions that are needed to make harmonization around results reporting work in practice. Against each action the cases provided examples that can be turned into specific interventions for the development community. These factors consider the interrelationship between the country context and the international context in a way that is supportive of improving country systems.

-------------------------

ENDNOTES: Harmonization Around Results Reporting

[1] International Conference on Financing for Development, Monterrey, Mexico, March 18-22, 2002, available at http://www.un.org/esa/ffd/ffdconf/ (last visited Oct. 29, 2004).

[2] The Second International Roundtable on Managing for Development Results, Marrakech, Morocco, Feb. 4-5, 2004, available at http://www.mfdr.org/2ndRoundtable.html (last visited Oct. 29, 2005).

[3] The Paris High Level Forum, France, Feb. 28-Mar. 2, 2005, available at http://www.aidharmonization.org/ah-wh/secondary-pages/Paris2005 (last visited Oct. 29, 2005).

[4] G8 Gleneagles 2005, Perthshire, Scotland, July 6-8, 2005, available at http://www.g8.gov.uk/servlet/Front?pagename=OpenMarket/Xcelerate/ShowPage&c=Page&cid=1078995902703 (last visited Oct. 29, 2005).

[5] THE WORLD BANK, GLOBAL MONITORING REPORT, 2004, available at http://siteresources.worldbank.org/GLOBALMONITORINGEXT/Resources/0821358596.pdf (last visited Oct. 29, 2005).

[6] On the whole, there has been relatively solid growth performance in low-income countries over the past ten years, although high population growth rates in low-income countries temper the per capita gains. These aggregate figures disguise substantial variation across countries and volatility over time (World Bank Review of PRSP Implementation, 2005, forthcoming).

[7] IEO/OED PRSP Review, 2004. Capacity Building in Africa, An Operations Evaluation Department Review of World Bank Support, Operations Evaluation Department, 2005.

[8] 2005 REVIEW OF POVERTY REDUCTION STRATEGY APPROACH: BALANCING ACCOUNTABILITIES AND SCALING UP RESULTS, WORLD BANK, IMF, 2005.

[9] Countries start moving in that direction as they believe that managing for development results ("MfDR") may lead to a more effective and efficient public sector management system and process.

[10] THE WORLD BANK, PROJECT APPRAISAL DOCUMENT, THIRD ENVIRONMENT PROGRAM SUPPORT PROJECT (Apr. 2004).

[11] THE WORLD BANK, PROGRAM DOCUMENT, SECOND ENVIRONMENT PROGRAM (Oct. 1996).

[12] THE WORLD BANK, COUNTRY ASSISTANCE STRATEGY (1997).

[13] Poverty reduction as a key objective of government policy in Mozambique predates the introduction of the PRSP, dating back to the mid-nineties with a succession of Policy Framework Papers.

[14] THE WORLD BANK, OPERATIONS EVALUATION DEPARTMENT (OED), CAPACITY BUILDING IN AFRICA: AN OED EVALUATION OF WORLD BANK SUPPORT (2005), available at http://lnweb18.worldbank.org/oed/oeddoclib.nsf/24cc3bb1f94ae11c85256808006a0046/5676a297fe57caf685256fdd00692e32/$FILE/africa_capacity_building.pdf (last viewed Oct. 31, 2005).

[15] This movement to increased donor coordination started in the 1990s and was formalized in 2000 with the establishment of the Joint Donor Programme for Macro-Financial Support ("JP"), underpinned by an MOU known as the Joint Agreement ("JA"). This agreement provided a common focus and allowed for regular dialogue and an annual review, known as the Joint Donor Review ("JDR"), between signatories. And, over this period, the donor group grew to the G-15 by 2004.

[16] With the budget support aid modality, donors essentially give up the direct control over resource use that is typical of "project" support. In its place, there is then need for some combination of two factors: the supportive structures (including M&E systems) to ensure that the use of these resources meets the accountability requirements of donors, and, second, a trust and willingness on the part of donors to accept some level of risk regarding the use of those funds by the government. Clearly, with an improved PMS, there is greater ability to measure and monitor results and, subsequently, a greater trust/lower risk in relying on government systems to measure and monitor performance. This aid modality thus serves to work in support of donor use of government systems.

[17] INTERNATIONAL MONETARY FUND AND INTERNATIONAL DEVELOPMENT ASSOCIATION, POVERTY REDUCTION STRATEGY PAPER-PROGRESS REPORT, Joint Staff Assessment, (2003).

[18] Three-quarters of external project assistance was distributed outside the budget.

[19] The government took several important steps towards harmonization and alignment to enhance aid effectiveness. These included the establishment of the Independent Monitoring Group, the formulation of a Tanzanian Assistance Strategy in 2002, a Poverty Reduction Strategy, a Poverty Reduction Budget Support Facility (with a common Performance Assessment Framework), and a Tanzanian Joint Assistance Strategy.

[20] Norway was the first bilateral to commit all its foreign aid through the budget.

[21] David Booth, Poverty Monitoring Systems: An Analysis of Institutional Arrangements in Tanzania, Overseas Dev. Inst., Working Paper 247 (Mar. 2005), available at http://www.odi.org.uk/publications/working_papers/wp247.pdf (last visited Oct. 31, 2005).

[22] The focus on poverty and reforms began in 1986, with a CG meeting in 1995 that started discussions on a Poverty Eradication Action Plan (PEAP) 1996/1997.

[23] THE WORLD BANK, TOWARD COUNTRY-LED DEVELOPMENT: A MULTI-PARTNER EVALUATION OF THE COMPREHENSIVE DEVELOPMENT FRAMEWORK, Uganda Case Study (2003), available at http://lnweb18.worldbank.org/oed/oeddoclib.nsf/DocUNIDViewForJavaSearch/4AC98B40DB95F73B85256DAC0059775D/$file/country_case_studies.pdf (last visited Oct. 31, 2005).

[24] David Booth and Henry Lucas, Desk Study of Good Practice in the Development of PRSP Indicators and Monitoring Systems, Overseas Development Institute (2001) available at http://siteresources.worldbank.org/INTPRS1/Resources/PRSP-Review/odi.pdf (last visited Oct. 31, 2005).

[25] It is difficult to attribute short term policy actions to improvements in impact indicators, and many of these indicators change too slowly to be useful for shorter term results reporting or annual decision making.

[26] These findings were supported by a 2004 review by OED and IMF evaluation departments which found that the results orientation was the weakest part of the PRSPs reviewed.

[27] David Booth and Xavier Nsabagasani, Poverty Monitoring Systems: An Analysis of Institutional Arrangements in Uganda, Working Paper 246, Overseas Dev. Inst. (Mar. 2005), available at http://www.odi.org.uk/publications/working_papers/wp246.pdf (last visited Oct. 31, 2005).

[28] This is recognized in the case of Tanzania where the MoU states that the donors will work to align the PAF and PRS.

[29] Id.

[30] THE WORLD BANK, REVIEW OF WORLD BANK CONDITIONALITY (Sept. 2005), available at http://siteresources.worldbank.org/PROJECTS/Resources/40940-1114615847489/webConditionalitysept05.pdf (last visited Oct. 30, 2005). The conditionality review also pointed out the difficulty in linking disbursement volumes directly to outcome indicators. It stated that this is hampered by a number of problems: unavailability of suitable short-term outcome indicators (e.g., for public finance management and private sector development), substantial time lags in data availability, unreliability of data, and the risk of penalizing governments for outcomes that are outside their control. Formulaic application of outcome-based conditionality would also reduce the flexibility and adaptability of the programmatic approach. An increased results orientation should therefore be based on an appropriate mix of performance and policy indicators, and development practitioners generally agree that these indicators should be identified and tracked regularly as part of overall evaluation frameworks. At the same time, country experience and practical concerns suggest that the use of outcome-based indicators as conditions for disbursement should be approached with caution.

[31] The World Bank, id. at 17.

[32] WORLD BANK, OPERATIONS EVALUATION DEPARTMENT, THE POVERTY REDUCTION STRATEGY INITIATIVE: AN INDEPENDENT EVALUATION OF THE WORLD BANK'S SUPPORT THROUGH 2003, 17 (2004), available at http://lnweb18.worldbank.org/oed/oeddoclib.nsf/24cc3bb1f94ae11c85256808006a0046/6b5669f816a60aaf85256ec1006346ac/$FILE/PRSP_Evaluation.pdf (last visited Oct. 30, 2005).

[33] Note by Guy Razafindralambo and Lisa Gaylord, Co-President, EP3 Joint Steering Committee (Environment/Rural Development Team Leader, USAID/Madagascar).

[34] The G8 Gleneagles Summit, with additional financial commitments and debt relief matched by steps to improve development effectiveness.

[35] THE WORLD BANK, AID FINANCING AND AID EFFECTIVENESS 9 (Sept. 2005), available at http://siteresources.worldbank.org/DEVCOMMINT/Documentation/20651878/DC2005-0020(E)-AidFin.pdf (last visited Oct. 30, 2005).

[36] USAID has some of the longest experience in this area, but experience also points to the danger that the system can become too top-heavy, too costly and time consuming. Operating units and implementing partners in USAID have argued that reporting can compete with the time spent implementing and that reporting upwards for higher-order results reporting is not directly relevant to them (OECD/DAC 2000 p23).

[37] Garnett (2004); Holmes (2003); Andrews and Moon (2004); Piron and Evans (2004).

[38] See generally, THE WORLD BANK, OPERATIONS EVALUATION DEPARTMENT, EVALUATION CAPACITY DEVELOPMENT, at http://www.worldbank.org/oec/ecd (last visited Nov. 8, 2005).

[39] Examples include the suspension of Graduated Tax, the cotton subsidy, the "Wealth for All" plan, Agricultural Zoning, and the school-feeding program.

[40] THE WORLD BANK, 2005 COMPREHENSIVE DEVELOPMENT FRAMEWORK EVALUATION (2005).
