At present, Defence is executing around 180 major capital investment projects with an aggregate value in excess of $123 billion. Consistent with what’s at stake, defence acquisition is subject to annual audit and periodic external review (including in 2003, 2008, 2012 and 2013). Most recently, the 2015 First Principles Review recommended sweeping changes to the planning and execution of defence projects.
With such intense scrutiny, you’d expect clear and unambiguous reporting on the financial aspects of defence projects. But while reporting has improved substantially over the past decade, clarity remains elusive. Extraordinarily, it’s often not possible to say with any confidence whether a project has been delivered within planned financial resources or not—even at the conclusion of the project. That’s true even of the statutory annual reporting by Defence and the ANAO.
The central problem is that Defence’s indexation of project budgets conceals the true state of affairs. To make matters worse, the indexation regime switched from one type of systematic error to another in mid-2010. God help anyone who wants to understand what’s been going on in a project that straddles both. (Project budgets are also adjusted to take account of their exposure to foreign exchange variations, but this is easy to take into account and need not be considered further.)
Prior to July 2010, defence projects were approved in terms of the dollars of the year of approval. For example, the Collins project was approved with a budget of $3,892 million as at June 1986. Over subsequent years, the unspent budget was adjusted twice-yearly to take account of the changing price of labour and materials. This makes sense: projects are usually liable for the changing price of inputs used by suppliers in contracts. Over the life of the Collins project, the approved project budget grew by $1,229 million as a result of price and foreign exchange movements.
Provided a project ran to schedule all was well, but things got problematic when projects were delayed—a frequent occurrence then as now.
When project activity and payments were pushed into future years, indexation increased the amount of money available in recognition of the falling buying power of the dollar. But because the prices of labour and materials often grow more quickly than CPI inflation, indexation provided an unambiguous real boost (i.e. above inflation) to the project budget. Yet none of this is recognised in Defence’s reporting. The real cost of projects grew as a result of delays, and no one was the wiser.
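The mechanics can be seen in a minimal sketch. All figures below (unspent budget, escalation and CPI rates, length of delay) are illustrative assumptions, not Defence data:

```python
# Hypothetical numbers: a project slips three years past its planned
# delivery date with $1,000m unspent. Under the pre-2010 regime the
# unspent budget keeps being indexed at the labour/materials escalation
# rate, while CPI measures the actual fall in buying power.
unspent = 1_000.0        # $m unspent at the planned delivery date (assumed)
escalation = 0.04        # assumed annual labour/materials escalation
cpi = 0.025              # assumed annual CPI inflation
delay_years = 3

indexed = unspent * (1 + escalation) ** delay_years   # budget after indexation
cpi_only = unspent * (1 + cpi) ** delay_years         # pure inflation adjustment

# The gap is a real (above-inflation) boost to the budget that the
# reporting never surfaces.
real_boost = indexed - cpi_only
print(f"indexed: {indexed:.1f}m, CPI-only: {cpi_only:.1f}m, "
      f"real boost: {real_boost:.1f}m")
```

On these assumed numbers, three years of slippage quietly adds nearly $48 million of real budget.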
The impact was often substantial. In 2010–11, the ANAO reported that across a mere eight delayed projects, $295 million in indexation (and counting) had been provided post-planned delivery dates. Even if two-thirds of the indexation was pure CPI inflation, there’s still an unacknowledged real cost increase of $100 million paid for by the taxpayer.
The 2009 White Paper moved overall Defence funding onto a fixed price indexation model. Soon after, Defence initiated an internal funding regime based on the one-off application of price indices recommended by the 2009 Pappas Review—including for DMO. Note, however, that there has never been, and still isn’t, any connection between project indexation and overall budget indexation.
In any case, after July 2010, defence projects were approved on the basis of out-turned nominal dollars, anticipating price movements via a so-called Specialised Military Equipment (SME) Weighted Average. Already-approved projects were given a one-off out-turned adjustment at the same time. In effect, rather than wait and discover actual price movements using measured econometric indices, assumptions were made about where the indices would sit year-by-year into the future. Clearly, guesses about the future are never going to be as good as waiting to see what happens. But setting that obvious point aside, there’s a greater problem: out-turning requires the spending schedule to be known ahead of time.
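Out-turning itself is simple arithmetic: spread the base-year estimate across the planned spending schedule and inflate each year’s share by the assumed index. A sketch, with an invented schedule and a flat rate standing in for the SME Weighted Average path:

```python
# Hypothetical out-turning: convert a base-year estimate into nominal
# (out-turned) dollars using an assumed spending schedule and an assumed
# flat escalation rate. All figures are invented for illustration.
base_cost = 1_000.0                  # $m in base-year dollars (assumed)
schedule = [0.2, 0.3, 0.3, 0.2]      # assumed share of spend in years 0..3
index_rate = 0.03                    # assumed annual escalation

out_turned = sum(
    base_cost * share * (1 + index_rate) ** year
    for year, share in enumerate(schedule)
)
print(f"out-turned budget: {out_turned:.1f}m")
```

The sketch makes the dependency concrete: the answer hinges entirely on the schedule. Push the spending shares later and the out-turned figure changes, which is why the method breaks down when the schedule isn’t known—or doesn’t hold.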
As before, provided a project runs to schedule things are not too bad. But if the project is delayed, things can get ugly very quickly. As activity and payments are pushed into the future, the project is left short because of the falling buying power of the dollar.
The magnitude of the problem is larger than in the old regime. Prior to 2010, delayed projects received unacknowledged real cost increases proportional to the difference between indexation and CPI inflation. After July 2010, delayed projects suffered a shortfall proportional to the full rate of the indexation. Under the new regime, it’s entirely possible for a project to run out of money even in the absence of a real increase to the cost of a project. Thus, we’ve gone from a regime that protects the guilty to one that punishes the innocent.
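The asymmetry between the two regimes can be put in numbers. Another sketch with invented figures:

```python
# Hypothetical post-2010 project: $500m of work (valued in approval-year
# dollars) slips two years, but the out-turned budget is fixed at approval
# and receives no further indexation.
remaining = 500.0      # $m of remaining work at approval-year prices (assumed)
index_rate = 0.03      # assumed annual escalation actually experienced
slip_years = 2

# The same work must now be bought with cheaper future dollars:
cost_after_slip = remaining * (1 + index_rate) ** slip_years
shortfall = cost_after_slip - remaining   # money the fixed budget lacks
print(f"shortfall from a {slip_years}-year slip: {shortfall:.2f}m")
```

No real cost growth has occurred and the scope is unchanged, yet on these assumptions the project is roughly $30 million short—the ‘punishes the innocent’ case.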
No indexation regime based on econometric indices will ever be able to precisely compensate for the actual movement of prices in a particular project. That said, some choices of indices are better than others. Case in point: the SME Weighted Average seems unrealistically optimistic about prices being constrained. But irrespective of the indices employed, if Defence persists in applying indexation in plainly fraught ways, we’ll never be able to tell what’s going on in the $7 billion a year capital equipment program.
It wouldn’t be hard to do properly—it’s a simple matter of arithmetic to track the impact of inflation and the additional costs arising from delays. The result can then be expressed in any chosen year’s dollars with the real cost variation isolated. That’s exactly the approach the Pentagon takes in its Selected Acquisition Reports to meet its statutory reporting obligations. The current Australian practice of expressing project budgets in out-turned dollars is nonsensical because it adds together dollars with different buying powers—the financial equivalent of adding apples and oranges.
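Expressing project spend in a chosen base year’s dollars is just deflation, which a few lines illustrate. The spend amounts and CPI path below are invented for illustration:

```python
# Hypothetical project spend by year in nominal (then-year) $m, and an
# assumed CPI index with 2007 as the base year (2007 = 1.00).
spend_nominal = {2007: 300.0, 2012: 400.0, 2020: 500.0}
cpi_index = {2007: 1.00, 2012: 1.15, 2020: 1.38}

# Deflate each year's spend to 2007 dollars before summing, so every
# dollar in the total has the same buying power.
total_2007 = sum(v / cpi_index[y] for y, v in spend_nominal.items())

# The "apples and oranges" alternative: adding then-year dollars directly.
total_nominal = sum(spend_nominal.values())
print(f"2007 dollars: {total_2007:.1f}m vs nominal sum: {total_nominal:.1f}m")
```

With the deflated series in hand, delay-driven inflation effects can be separated from genuine real cost growth, which is precisely what the current out-turned presentation obscures.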
It’s really worth getting this right. Consider the AWD project. With the future of ASC and naval shipbuilding hanging in the balance, we need to sort out how much of the cost pressure experienced by the project is real and how much is an artefact of the indexation regime. Moreover, we need the answer in terms of today’s dollars rather than a mishmash of out-turned dollars that gives equal weighting to dollars spent in 2007 and 2020, even though the buying power of the former is expected to be 38% higher than that of the latter.
If we are serious about improving the performance of defence acquisition projects, the first and essential step is to accurately measure project performance. The current approach fails to do so.