Drones and the kill-decision-making loop
4 Apr 2013

 MQ-9 Reaper unmanned aerial vehicle

Globally, state use of armed unmanned aerial vehicle (UAV, aka 'drone') technology has advanced in leaps and bounds in recent years, as drones have provided significant advantages in counter-terrorism and warfare. The US, for example, has clearly established itself as the most prolific user of drone technology, with such success against al Qaeda that the group recently developed a manual on how to avoid drone strikes. At the 2013 Australian International Airshow, Minister for Defence Stephen Smith indicated that armed UAVs might have a role in the ADF in the future and called for a debate. He's right to identify the need for this conversation, especially given how rapidly the technology is evolving and how heavily it is used around the world. An important part of any debate should be about what comes next: autonomous killer drones.

By some estimates, fully autonomous systems might be as close as five years away. This will depend upon the pace of innovation, societal acceptance and the security requirements of states, but also on how quickly we progress toward what computer scientists call the singularity—the point when the power of computers exceeds the power of human brains.

When it comes to more complex autonomous systems such as drones, a key question is how much autonomy we want them to have, especially when it comes to targeting with lethal force. The United States has already begun a debate about autonomous lethal systems. Having the Australian debate now will enable us to shape the future development of drone technology and avoid some of the mistakes that might otherwise be made.

In November 2012, Human Rights Watch (HRW) released 'Losing humanity: the case against killer robots', a report outlining the organisation's concerns about fully autonomous weapons such as drones and the inherent risks of completely removing humans from the kill-decision-making loop. Decision-making on drone strikes, even with multiple human inputs, has already been particularly contentious due to concerns about civilian casualties. The report's basic premise is that the risks of fully autonomous lethal systems (i.e. no human in the decision-making loop) far outweigh the benefits and, as such, they should be banned before they are created.

On the other hand, blogger Dan Trombly asserts that a fully autonomous system is impossible because no human-made machine will be exempt from the command structure. He also argues that the last thing a military commander would want is a weapon that they have no control over.

Both the HRW report and Trombly focused on the use of such technology by states, which limited the scope of the discussion. Even if states (and the ADF) don't acquire these systems, they might have to defend against their use by non-state actors (NSAs). Australia, like many other nations, is bound by a set of legal rules when engaged in conflict. Regardless of the combatant they face, or the technology the combatant uses, this won't change. But NSAs will not be bound in the same way, will not agree to 'fight fair', and are more likely than recognised states to modify and deploy systems with greater autonomy and less regulation to achieve their ends.

According to a recent Peter Singer TED talk on military robots and the future of war, the parts required to build a hand-held Raven drone are freely available on the public market, and one can be built for about $1,000. Technology is getting cheaper, more advanced and more accessible. It is therefore only logical to presume that the links between non-state actors and UAVs will continue to grow, both here and abroad.

An NSA could hypothetically pair rudimentary open-source facial and voice recognition technology with a homemade drone, which could be weaponised with instructions available on the internet and then programmed to lock on to a Google Maps location drawn from a target's geotag on Facebook or Twitter. Perhaps it seems far-fetched, but Hamas—a recognised terrorist organisation—has demonstrated enough rudimentary success in the DIY drone arena to warrant an Israeli military response.

In the above scenario, NSAs could literally ‘launch and forget’. Finding out if they were successful in their attack would be merely a matter of monitoring social networking and media sites, taking credit if the plot worked, denying it if it didn’t and condemning it if it suited them.

If we shift from NSA collectives to individuals, or lone-wolf types, the gun-toting drone that 'Milo Danger' built in the US with commercially available equipment is a prime example of how easily open-source technology can be accessed and modified for potentially malevolent use. Consider the difficulty protective service personnel would face in defending a senior politician or foreign dignitary against an assassination attempt using such a device. Thankfully, Danger used a paintball gun rather than a real one, and his friends volunteered to be shot.

Australia has entered the Drone Age, and the defence and security commentariat discussing it should think beyond how many drones we might require and which particular radar, missile or camera types they should be fitted with. It's time to start debating the levels of acceptable autonomy and how to coordinate state action against non-state actors who maliciously deploy armed autonomous drones.

Perhaps a starting point for the debate could be considering what constitutes 'autonomy', or what constitutes a 'combatant', in the armed autonomous drone scenario above—something my colleague Chloe Diggins pondered previously on The Strategist with regard to Twitter users. How far up the production stream does one travel to prosecute (or counter-attack)? Is a combatant the person who launches, houses or maintains an armed drone? The person who uploads the video to YouTube? What about the individual who builds the drone without knowing its intended use, or the programmer who wrote the script far from the battlefield? Yes, a debate is worth having.

Clint Arizmendi is a research & analysis officer at the Australian Army’s Land Warfare Studies Centre. The views expressed are his own and do not reflect those of the Australian Department of Defence or the Australian Government. Image courtesy of U.S. Air Force.