
Drones and the kill-decision-making loop

Posted by Clint Arizmendi on April 4, 2013 @ 11:30

 MQ-9 Reaper unmanned aerial vehicle [1]

Globally, state use of armed unmanned aerial vehicle (UAV, aka 'drone') technology has advanced in leaps and bounds in recent years, as these systems offer significant advantages in counter-terrorism and warfare. The US, for example, has clearly established itself as the most prolific user of drone technology, with such success against al Qaeda that the group recently developed a manual [2] on how to avoid drone strikes. At the 2013 Australian International Airshow, Minister for Defence Stephen Smith indicated that armed UAVs might have a role in the ADF in the future [3] and called for a debate. He's right to identify the need for this conversation, especially given how rapidly the technology is evolving and how heavily it is used around the world. An important part of any debate should be about what comes next: autonomous killer drones.

By some estimates [4], fully autonomous systems might be as little as five years away. That will depend on the pace of innovation, societal acceptance and the security requirements of states, but also on how quickly we progress toward what computer scientists call the singularity [5]: the point at which the power of computers exceeds that of human brains.

When it comes to more complex autonomous systems such as drones, a key question is how 'autonomous' we want them to be, especially when it comes to targeting with lethal force. The United States has already begun a debate about autonomous lethal systems. Having the Australian debate now will enable us to shape the future development of drone technology and avoid some of the mistakes that might otherwise be made.

In November 2012, Human Rights Watch (HRW) released 'Losing humanity: the case against killer robots [6]', a report outlining the organisation's concerns about fully autonomous weapons such as drones and the risks inherent in completely removing a human from the kill-decision-making loop. Decision-making on drone strikes, even with multiple human inputs, is already a particularly contentious issue because of concerns about civilian casualties. The report's basic premise is that the risks of fully autonomous lethal systems (i.e. no human in the decision-making loop) far outweigh the benefits and that, as such, they should be banned before they are created.

On the other hand, blogger Dan Trombly asserts [7] that a fully autonomous system is impossible because no human-made machine will be exempt from the command structure. He also argues that the last thing a military commander would want is a weapon that they have no control over.

Both the HRW report and Trombly focus on the use of such technology by states, which limits the scope of the discussion. Even if states (and the ADF) don't acquire these systems, they might have to defend against their use by non-state actors (NSAs).

Australia, like many other nations, is bound by a set of legal rules when engaged in conflict. Regardless of the combatant it faces, or the technology that combatant uses, this won't change. But NSAs won't be bound in the same way, won't agree to 'fight fair', and are more likely than recognised states to modify and deploy systems with greater autonomy and less regulation to achieve their ends.

According to a recent Peter Singer TED talk on military robots and the future of war [8], the parts required to build a hand-held Raven drone are freely available on the public market, and the aircraft can be built for about $1,000. Technology is getting cheaper, more advanced and more accessible, so it's only logical to presume that the links between non-state actors and UAVs will continue to grow both [9] here and abroad.

An NSA could hypothetically combine rudimentary open-source facial and voice recognition technology with a homemade drone [10], weaponise it using instructions available on the internet, and program it to lock on to a Google Maps location taken from a target's geotag on Facebook or Twitter. That might seem far-fetched, but Hamas, a recognised terrorist organisation, has demonstrated enough rudimentary success in the DIY drone arena to warrant an Israeli military response [11].

In the above scenario, NSAs could literally ‘launch and forget’. Finding out if they were successful in their attack would be merely a matter of monitoring social networking and media sites, taking credit if the plot worked, denying it if it didn’t and condemning it if it suited them.

If we shift from NSA collectives to individuals, or lone-wolf types, the gun-toting drone [12] that 'Milo Danger' built in the US from commercially available equipment is a prime example of how easily open-source technology can be accessed and modified for potentially malevolent use. Consider the difficulty protective service personnel would face in defending a senior politician or foreign dignitary against an assassination attempt using such a device. Thankfully, Danger used a paintball gun rather than a real one, and his friends volunteered to be shot.

Australia has entered the Drone Age [13], and the defence and security commentariat discussing it should think beyond how many drones we might require and which radars, missiles or cameras they should be fitted with. It's time to start debating the levels of acceptable autonomy and how to coordinate state action against non-state actors who maliciously deploy armed autonomous drones.

Perhaps a starting point for the debate could be considering what constitutes 'autonomy', or what constitutes a 'combatant', in the armed autonomous drone scenario above, something my colleague Chloe Diggins pondered previously on The Strategist [14] with regard to Twitter users. How far up the production stream does one travel to prosecute (or counter-attack)? Is a combatant the person who launches, houses or maintains an armed drone? The one who uploads the video to YouTube? What about the individual who builds the drone without knowing its intended use, or the programmer who wrote the script far from the battlefield? Yes, a debate is worth having.

Clint Arizmendi is a research & analysis officer at the Australian Army’s Land Warfare Studies Centre. The views expressed are his own and do not reflect those of the Australian Department of Defence or the Australian Government. Image courtesy of U.S. Air Force. [15]



Article printed from The Strategist: https://www.aspistrategist.org.au

URL to article: https://www.aspistrategist.org.au/drones-and-the-kill-decision-making-loop/

URLs in this post:

[1] Image: http://www.aspistrategist.org.au/wp-content/uploads/2013/04/2566048938_581bba522f_z.jpg

[2] manual: http://www.sbs.com.au/news/article/1740682/Al-Qaeda-manual-how-to-avoid-drone-strikes

[3] armed UAVs might have a role in the ADF in the future: http://www.afr.com/p/national/debate_takes_off_over_use_of_armed_c4Nbb2uGX7NvoF4jF7ONEO

[4] estimates: http://killerapps.foreignpolicy.com/posts/2012/11/19/the_campaign_to_ban_killer_robots_is_here

[5] singularity: http://en.wikipedia.org/wiki/Technological_singularity

[6] Losing humanity: the case against killer robots: http://www.hrw.org/reports/2012/11/19/losing-humanity-0

[7] blogger Dan Trombly asserts: http://www.cnas.org/blogs/abumuqawama/2012/11/rage-against-machines.html

[8] TED talk on military robots and the future of war: http://www.ted.com/talks/pw_singer_on_robots_of_war.html

[9] continue to grow both: http://www.abc.net.au/news/2013-03-01/drones-set-for-large-scale-commercial-take-off/4546556

[10] homemade drone: http://www.lowyinterpreter.org/post/2012/09/26/Need-a-drone-Why-not-print-one.aspx

[11] response: http://www.cbsnews.com/8301-202_162-57551216/israel-says-it-knocked-out-hamas-drone-program/

[12] gun-toting drone: http://www.aclu.org/blog/technology-and-liberty-national-security/diy-armed-drone

[13] Drone Age: http://www.globalpost.com/special-reports/the-drone-age-why-we-should-fear-global-proliferation-uavs

[14] previously on The Strategist: http://www.aspistrategist.org.au/are-social-media-users-now-legitimate-targets/

[15] U.S. Air Force.: http://www.af.mil/photos/mediagallery.asp?galleryID=5368

Copyright © 2024 The Strategist. All rights reserved.