Capability development—still a work in progress (1)
28 Nov 2013

Although it went largely unnoticed, at the end of October the Australian National Audit Office (ANAO) released its performance audit of Capability Development Reform in Defence. It has all the weaknesses, strengths and charm we’ve come to expect from the fine folks at the audit office.

Its major weakness is a focus on compliance at the expense of outcomes, but that reflects the nature of performance audits rather than an error on the part of the authors. And, by testing Defence’s implementation of the 2003 Kinnaird and 2008 Mortimer reviews of acquisition, much of what matters is covered in any case.

As with prior ANAO audits, the report’s strength comes from the systematic examination of evidence. Time and time again, the audit office has shown that some of the most damning indictments of Defence can be found sitting in that organisation’s own filing cabinets, or by comparing its public pronouncements to its actual performance.

As for charm, this particular report outdoes its predecessors in terms of carefully measured understatement and the tongue-in-cheek disclosure of facts that speak for themselves. Who’d have thought, for example, that the ANAO would tabulate two pages’ worth of damning extracts from inter-ministerial correspondence regarding tardy advice from Defence on delays and changes to the scope of projects? It contains such gems as this Ministerial notation: ‘No advice on the progress of this project has been brought to government attention since it was approved over 10 years ago’—on paperwork for a project that was 70 months late.

So what did the auditors conclude? The report is organised around four themes that have recurred in successive reviews of Defence’s capability development activities. At the risk of oversimplifying the report’s 15 chapters and 325 pages, the more interesting conclusions grouped by theme are:

Capability development – organisation and processes

The good news is that Defence has documented its capability development processes and improved its record keeping. That’s nice. The bad news is that Defence’s capability development group continues to be staffed predominantly by military personnel with short tenures and limited experience in capability development—despite successive recommendations to the contrary. So long as multi-million dollar projects are conceived and managed by people whose professional training and future careers lie elsewhere, there’s a limit on what can be achieved. Some military expertise on the operational realities of using defence systems is essential, but it’s time we recognised that conceiving, costing and developing defence acquisition proposals are skills in their own right.

There’s also been slow progress in reforming two critical aspects of the capability development process: the entry of projects into the Defence Capability Plan (DCP), and the estimation of personnel and operating costs associated with new capability. Until these matters are resolved we can have no confidence that the multi-billion dollar plans for developing the defence force are either strategically valid or affordable. And there’s a good case to be made that these are related problems—once projects without accurate costings are allowed into the DCP, they gain an organisational momentum that makes them hard to kill off, so there’s no reason to expect the plan to represent cost-effective capability planning.

Improving advice to government when seeking approval

Again, there’s some good news to report. The ANAO found that the assessment of technical risks in projects has improved since the 2009 Pappas Review (PDF), and that the Defence Science and Technology Organisation is providing advice through a mature and well-documented process. Of course, ‘well-documented’ and ‘accurate’ are two different things, and only post mortem reviews of the outcome of projects approved under the new risk assessment process will tell us how successful it has been—something that won’t be possible for some time given the typical timescale of the more complex major projects.

We’re not as convinced as the ANAO about the ability of the Department of Finance to verify Defence’s cost estimates. So while it’s probably a good thing that the provision of information to Finance has improved, we wouldn’t expect it to fix the chronic underestimation of final costs (and schedules, which also have a financial impact) that has long been the hallmark of Defence projects. For complex financial calculations, arm’s length is too far away.

Not for the first time, Defence is seen to be falling short of the recommendations of previous reviews to include a rigorous cost-benefit analysis of its developmental projects against a genuine off-the-shelf (OTS) option. The ANAO observes that the inclusion of an OTS option was met in ‘process terms’ but not in ‘outcome terms’. Simply put—in our words, not the Audit Office’s—Defence too often pays lip service to OTS while pursuing expensive and risky bespoke solutions.

We’ll come back in a later post to discuss the remaining two themes; ‘improving accountability and advice during project implementation’ and ‘reporting on progress with reform’.

Andrew Davies is senior analyst for defence capability at ASPI and executive editor of The Strategist. Mark Thomson is senior analyst for defence economics at ASPI. Image courtesy of Flickr user BWJones.