Defence sustainment: low profile, but very important
13 Jul 2017

Defence spent $8.3 billion on sustaining its equipment in 2016–17, compared with $9.3 billion on new capital equipment. But most of the words written on defence procurement (including mine) are about acquisition projects. Thankfully, the Australian National Audit Office (ANAO) decided to take a look at the decidedly unfashionable world of defence materiel sustainment in the performance audit that hit the streets this week. As is usually the case, there’s a mixture of low-hanging fruit—the sprawling Department of Defence can always be counted on for a free kick or two—and more considered, deeper analysis of the department’s processes and structures.

The overall message is fairly reassuring: the auditors describe the fundamentals of Defence’s framework for managing materiel sustainment as ‘fit for purpose’. But they also note that there’s scope for Defence ‘to improve its performance monitoring, reporting and evaluation activities to better support the management and external scrutiny of materiel sustainment’.

I agree about there being scope for better and clearer reporting. The last time the department reported detailed sustainment costs for the full suite of outputs was in an extensive series of tables in its 2006–07 annual report. That information was invaluable for those of us tasked with providing well-informed advice about defence capability planning and management. For example, the force structure costings in ASPI’s 2008 Strategic choices paper were based on those tables. And a comparison of sustainment spending on Collins submarines and Anzac frigates provided an early indicator that all was not well in the maintenance of our submarines.

In recent years, public information has become more aggregated, sparser and less revealing, despite Defence having significantly better internal data management mechanisms than was the case a decade ago.

Defence’s quarterly performance reports are a clear example of the problems in reporting the efficiency and effectiveness of sustainment activities. As the name suggests, each report covers a three-month period, yet takes two months to produce. Worse, in the ANAO’s view the reports are ‘neither complete nor reliable’, and they omit information Defence possesses that would give the reader a clearer picture.

The audit report includes a case study to illuminate the problems. Annex 3 is the April–June 2016 quarterly sustainment report for the perennially low-hanging Tiger armed reconnaissance helicopter. The figures show us that 100% of the annual sustainment budget of $134 million was expended, but only 68% of planned flying hours were achieved. Yet the report’s ‘traffic lights’ were either green or amber, and the reader could be excused for thinking things were tracking well enough. The ANAO observes that the problem is that the report’s traffic lights

focus attention on the measures contained in contractual and intra-Defence agreements (for example, aircraft ‘availability’) and not the measures that matter to Army, the end-user (aircraft actually able to be flown).

It’s all a little reminiscent of Yes, Minister’s efficient National Health Service hospitals, which met all sorts of performance indicators—except for treating sick people.
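A quick back-of-envelope calculation shows how badly the quarter’s headline figures sit with those reassuring traffic lights. Assuming, for simplicity, that the sustainment budget is meant to buy the planned flying hours and nothing else:

\[
\frac{\text{actual cost per flying hour}}{\text{planned cost per flying hour}} = \frac{100\%\ \text{of budget}}{68\%\ \text{of hours}} \approx 1.47
\]

In other words, each hour the Tigers actually flew cost roughly 47% more than planned, which is hard to square with a green or amber rating.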

The audit notes that Defence’s ‘sustainment gate reviews’—a relatively recent practice—draw management attention to issues that might otherwise escape notice. However, the output of those reviews doesn’t find its way into the quarterly reports. Joining up the two activities would seem to be an easy improvement to make. The ANAO recommends that Defence institute a ‘risk-based quality assurance process’ for the information included in the quarterly performance report. We can only hope that the new step doesn’t add more than an extra month to the process …

This performance audit was completed very early in the restructuring of the department and its processes that followed the First Principles Review. The picture that emerges of those changes is fairly positive, consistent with other observations. But there are also a couple of cautions. First, there’s always a danger that progress from previous reforms will be lost when new priorities are identified. The ANAO points to Defence’s work towards an improved asset management system as one such endangered reform. Second, the ANAO is wholly unconvinced—in the sense that the department couldn’t produce the data—that previous reforms delivered the substantial savings that have been claimed. On ‘Smart Sustainment’, the report says this:

Smart Sustainment had a ten-year savings target of $5.5 billion. In its 2014–15 Annual Report, Defence claimed to have achieved $2 billion of savings from the initiative in its first five years. Defence has not been able to provide the ANAO with adequate evidence to support this claim, nor an account of how $360 million allocated as ‘seed funding’ for Smart Sustainment initiatives was used.

Without clear metrics of success, and accurate data capture and reporting, there’s a risk that a future audit of the implementation of recent changes will be similarly unable to reach any firm conclusions.