If the venerable British naturalist David Attenborough were to make a television series entitled The life cycle of the Australian defence review, he would say that reviews are to Defence what the Gnu, or Wildebeest, is to the African veldt. These animals are not in danger of extinction; they move in herds along predictable courses; they’re none too bright but can kick hard. At certain times of the year they span the horizon, the earth trembling as they pass.
Close observation of the Australian defence review shows it has an eight-stage life cycle. Each review is special in its own way, but evolution establishes herd behaviour—having seen one Gnu you pretty much know what the next Gnu will be like.
Reviews are conceived largely in two ways: either by Oppositions annoyed at not being in charge or by governments facing unhappy crises. Examples of the first method include the First Principles Review, an election promise made by the then-opposition before it won office in 2013. Labor in opposition before 2007 made a similar commitment, which ultimately became the Pappas review. Oppositions can’t do much other than plan what they will do in government. Reviews are easy to announce, sound big and decisive, and don’t need to be actioned until after an election.
In government, reviews are often a way to deal with a crisis. Recall the sad case of Private Jake Kovco, who shot himself in Baghdad in 2006. The body of another unfortunate individual was mistakenly repatriated to Australia, and Defence Minister Brendan Nelson made public comments about the circumstances of Kovco’s death based on inaccurate advice. The civilian part of Defence had nothing to do with any of this, but a furious minister then launched the Proust review into ‘organisational efficiency and effectiveness.’ Crises beget reviews, but the offspring don’t have to look like the parents.
In the review gestation period, terms of reference are written, and external teams assemble and start to interact with Defence as they develop their thinking. I recall one group telling a politely non-committal Defence Committee how they would be shaken to their very core, such were the savings and efficiencies to be proposed. However, after the near-continuous use of organisational reviews—36 of them from Tange in 1973 to Peever in 2015—it’s difficult to come up with measures that haven’t been tried or considered before. Incrementalism is often the result. Smart reviewers engage in a detailed conversation with Defence and will adapt their recommendations after testing their value with experienced officials. As a general rule, the less interactive the review process, the less implementable its recommendations will be. Even clever reviews done in isolation don’t create the support needed for their recommendations to be delivered.
Stage three of the life cycle is the review’s birth. Although White Papers are a different breed of cat—the magnificent lions of the policy veldt—it’s hard to go past the 2009 White Paper’s launch on the deck of a warship, with Defence leaders assembled behind the Minister and 83 press releases proclaiming it to be the whitest of all White Papers. The media’s instant judgement can be cruel: the recent RAND review of shipbuilding, for example, took a hit because it didn’t include submarines. That’s a little unfair, because submarine building is a different industry in key ways, but it points to the need for governments to shape expectations of what reviews can deliver. A review’s launch is also often the last time it will receive much publicity, as the challenge then becomes implementation by insiders.
In the first years of a review’s life, a lucky group of officials gets to design an implementation plan, the wider Defence Organisation is introduced to its new ‘forever friend’ and the Department gets to grips with making it happen. A reporting system will be developed for ministers. Here, reality meets recommendations. It can emerge, for example, that some recommendations are well-meaning but cannot be made to work—external review teams can’t be expected to understand everything about how the Defence machine ticks. The first practical departures from implementing recommendations to the letter emerge. New bits of bespoke design are jerry-built to cover gaps. Savings may be booked, but only time will show if they’ll actually be delivered. None of this is necessarily bad; it’s just what happens when you apply broad-brush fixes to real-world problems.
In maturity, the best outcome is when a review becomes normal Defence business. Quick implementation of simple recommendations makes it look as though change is underway. Sensible changes will generate support in the workplace, and benefits flow to policy makers and implementers alike. In a year or two, entrenched opponents of review recommendations will either have moved on or been won over. The review team’s regular reporting to ministers will largely consist of green traffic lights for recommendations ‘implemented’ or ‘underway.’ Hopefully there will be few amber or red lights pointing to problem areas, where the most difficult and consequential review recommendations usually reside.
Of course, this describes situations where recommendations are accepted and easily implemented. The reality is often a much harder slog. So we come to the end of the first episode of The life cycle of the Australian defence review. In episode two we’ll see how reviews age and, sadly, die. And we’ll ask if there’s a life beyond for reviews that have slipped their earthly bonds.