Does merging improve aid efficiency?

Canada maple leaf (image: Flickr/Doug)

In foreign aid, ‘efficiency’ (which is distinct from ‘effectiveness’) usually refers to the costs of administering aid programs, that is, the costs of running aid agencies and of activities related to ODA programming and delivery. Although necessary for operating an aid agency, administrative costs are frequently framed as a drag on ODA that donors seek to reduce. In Canada, for example, the 2007 Budget listed “improving efficiency through reduced administrative costs…” as a key way to improve the effectiveness of Canadian aid.

The 2013 merger of the Canadian International Development Agency (CIDA) with the Department of Foreign Affairs and International Trade (DFAIT) to form the Department of Foreign Affairs, Trade and Development (DFATD) led some to speculate that the reorganisation would produce efficiency gains for the government by reducing duplication of effort and cutting administrative costs. While efficiency gains were not the main or official justification for the merger, which was instead framed as a way to improve the coherence of Canada’s foreign policy, some observers at the time remarked that job cuts and efficiency gains would be “hard to avoid”.

Greater efficiency in Canadian aid?

Over the past decade, Canada has reduced the cost of aid administration considerably. In 2003, Canada’s administrative spending peaked at 9.9% of ODA. That same year, the average administrative cost across OECD-DAC donors was 4.8%, making Canada’s one of the most costly aid programs in the world. Over the decade that followed, Canada cut its cost of aid administration by almost half, reaching a low of 4.9% in 2012, prior to the merger. (All data are from the OECD.Stat database; see the Total Flows by Donor dataset.)

To date, however, there is little evidence that the merger of CIDA and DFAIT meaningfully changed the administrative efficiency of Canada’s aid program. From 2012 to 2014, Canada’s administrative costs rose from 4.9% to 5.6% of the ODA budget. This increase was largely due to a reduction in the size of Canada’s ODA budget, which shrank from $5.5 billion to $4.4 billion over the same period, rather than to any real change in the value of administrative spending. While the dollar amount Canada spent on administration fell slightly over this period, from $277.7 million in 2012 to $259.7 million in 2014, this small reduction was outweighed by the much larger budget contraction, which caused administrative spending to rise as a proportion of ODA. (All data are from the OECD.Stat database; see the Total Flows by Donor dataset. There may be slight differences between OECD data and country data, as the former are reported on a calendar-year basis while the latter, in the case of Canada for example, are on a fiscal-year basis.)
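The arithmetic behind this proportional increase is simple. The sketch below (in Python, purely illustrative and not part of the original analysis) uses the rounded dollar figures above; the shares it produces differ slightly from the OECD percentages cited, owing to rounding and the calendar-/fiscal-year differences noted in parentheses.

def admin_share(admin_spending: float, total_oda: float) -> float:
    """Return administrative spending as a percentage of total ODA."""
    return 100 * admin_spending / total_oda

# Rounded, approximate figures for Canada (millions of dollars)
share_2012 = admin_share(277.7, 5500)  # admin $277.7m against ~$5.5bn ODA
share_2014 = admin_share(259.7, 4400)  # admin $259.7m against ~$4.4bn ODA

print(f"2012: {share_2012:.1f}% of ODA")  # ~5.0%
print(f"2014: {share_2014:.1f}% of ODA")  # ~5.9%
# Administrative spending fell in dollar terms, yet its share of ODA rose,
# because the ODA budget contracted faster than administrative costs did.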

The Canadian case stands in marked contrast to Australia’s merger experience. In Australia, administrative spending fell by more than 30% following the 2013 merger of the Australian Agency for International Development (AusAID) with the foreign ministry, DFAT: from 2012 to 2014, Australia’s aid administration costs fell from 6.1% to 3.9% of ODA.

It may still be too early to understand the full fallout of the merger in either case. One question that remains is whether spending cuts and organisational changes will translate into genuine efficiencies, or whether, at least in the Australian case, the program is simply becoming cheaper, and if so, why.

For Canada, the marginal change in administrative spending may increase pressure on the new DFATD to show improvements in the coherence of Canada’s foreign policy. After all, without tangible changes to administrative costs, the available data suggest that the new structure may be no more efficient than the old one.

Are autonomous aid agencies more expensive?

There is little evidence to suggest that autonomous aid agencies are less efficient than other models.

Prior to 2013, Canada, Australia, and the UK each had autonomous foreign aid agencies. With Canada and Australia merging their aid programs with foreign ministries in 2013, the UK is now the only OECD donor that organises aid in a separate government department.

From 2000-2014, the UK spent, on average, 4.3% of ODA on administrative costs, slightly below the OECD-DAC average of 4.6% over the same period. At its peak in 2003, UK administrative costs reached 7.4%. By 2014, these costs had fallen to 2.2%, making the UK’s Department for International Development (DFID) one of the most administratively efficient aid agencies among its peers.

While the UK case shows that autonomy can be efficient, the Australian case suggests that it may not be efficient in every instance. This tells us two things.

Firstly, the idea that an autonomous aid agency is necessarily less efficient than a merged structure appears to be untrue. Indeed, the UK example shows that autonomous agencies are capable of operating at low administrative cost. Combined with DFID’s reputation for high-quality and innovative aid programming, the UK’s experience suggests that autonomous agencies are not a less ‘optimal’ model than other forms of aid organisation.

Secondly, the differences in administrative efficiency among the three autonomous donor agencies discussed here suggest that efficiency may be more closely linked to donor-specific factors than to the choice of model. In other words, donors may be more or less administratively efficient under any given model, provided it is suited to their individual contexts and constraints (similar findings are discussed by Gulrajani 2012 in Struggling for Effectiveness: CIDA and Canadian Foreign Aid).

Although the discussion here has focused on administrative costs, it is important to recognise that in foreign aid administrative efficiency is not the same as effectiveness. The purpose of aid, however it is organised, is to reduce poverty. Administrative efficiency is pointless if aid programs are unable to fulfil their core mandate.

Rachael Calleja is a Senior Research Analyst at the Canadian International Development Platform and a PhD Candidate at the Norman Paterson School of International Affairs, Carleton University. 

This blog is cross-posted with permission from the Canadian International Development Platform (CIDP).



6 Comments

  • Further to my last comment, I note I have overlooked the blindingly obvious: the key output against admin costs is expenditure of the ODA budget. Yet, I don’t find this a satisfactory result measure to assess efficiency against. The most efficient way to spend aid would then be to simply give it all to the UN in a single transfer (for example). This could involve an extraordinarily low admin cost for a donor ODA programme. But I don’t think many people would think this was necessarily an efficient way to spend donor taxpayers’ money (although, one could argue strongly that it is). There has to be a better result against which to assess efficiency.

  • A thought-provoking post, thank you Rachael. In the NZ case the movement of the semi-autonomous NZAID back into the Ministry of Foreign Affairs and Trade was partly touted to be more efficient. Yet administrative costs remain reasonably static (they even went up initially). NZ may be different to other countries, however, as it has strict rules about what can be allocated to administrative functions and what can’t. As you argue, there doesn’t seem to be much difference in administrative costs according to a separate or internal ODA programme. However, I think further differentiation is necessary, as there are two models of ‘integration’ – models one and two, which involve different degrees of integration and there may be different admin costs associated with each.

    There is a broader question here, also. I think the use of administration costs as a proportion of overall ODA misses the point. I know it is the standard approach donors use to measure so-called efficiency but it doesn’t say much. It is also easy to game, as the questions Ben raises highlight. I would put efficiency numbers alongside tied aid numbers as some of the most unreliable data DAC has.

    DAC’s own definition of efficiency might be useful to apply here: “Efficiency measures the outputs — qualitative and quantitative — in relation to the inputs. It is an economic term which signifies that the aid uses the least costly resources possible in order to achieve the desired results. This generally requires comparing alternative approaches to achieving the same outputs, to see whether the most efficient process has been adopted.” This is a useful definition because it assesses the costs against what is achieved. So to actually measure efficiency in an aid programme, one could use the administrative cost but then gauge it against some high-level output or outcome achievements.

    Anyone know of any agencies that do this? Or have some ideas for what such an efficiency indicator might look like?

  • From 2005 to 2010 the AusAID administrative cost averaged 3.9% of ODA (see DAC Online Table 1). The increase in admin ratio from 2011 to 2013 was a sensible build-up in capacity to effectively use the planned increases in ODA volume. It is not surprising that the ratio has now dropped by 30% given the severe cuts to aid implemented by the Abbott Government.

  • Good read. It would also be interesting to know if the lower administrative burden inside the aid agencies has had any impacts on the amount of administration being done by implementing partners (and subsequent funding they receive to do this work). In other words, has the merger just shifted the administrative burden to the implementing partners thus painting a crude image of ‘efficiency’? The other area which warrants some more analysis is whether the merger also impacts the modalities and types of aid funding (ie. bigger programs/ less contracts) and what this means for the amount of oversight and type of work being done by aid managers inside the organisation. A subsequent question might focus on what is the right amount of aid management and technical expertise within the aid agencies and whether there is a good equilibrium to be reached whether in an autonomous or merged Department. The answers to these questions will have important implications for aid effectiveness also.

  • “The UK is now the only OECD donor that organises aid in a separate government department.” What about Japan with JICA? JICA is referred to as a ‘quasi-government’ organization; it is a separate aid agency but MOFA also delivers aid programs.

    • The classification of the UK as the only remaining autonomous aid agency is based on the OECD’s 2009 Managing Aid report, which identifies four separate models of aid organisation used by OECD-DAC donors. In this report, the UK is identified as a “Model 4” (where a ministry or agency that is not the ministry of foreign affairs is responsible for policy and implementation), while Japan is classified as a “Model 3” (where a ministry is responsible for policy formation and a separate executing agency is responsible for policy implementation). While you’re right that JICA is a separate agency, my understanding is that the key difference in the classification is based on function – while DFID is responsible for both policy formation and implementation, JICA acts as an implementing agency for MOFA policy.
