
Are we entering an age of killer robots?

ASU researcher among those clarifying what is, and isn't, possible at UN conference


ASU's Dr. Heather Roff testifies at the UN

Dr. Heather Roff, a research scientist at ASU's Global Security Initiative, testifies at a UN meeting of experts considering lethal autonomous weapons. Her testimony focuses on the unintended risks of autonomous weapons systems; her research focuses on artificial intelligence and human control. The meeting is being held under the auspices of the Convention on Certain Conventional Weapons, April 11-15 in Geneva. Photo courtesy of Heather Roff

April 13, 2016

The world’s militaries are close enough to wielding weapons that decide on their own whether to kill — called Lethal Autonomous Weapons Systems — that the United Nations this week gathered a select few experts, including an ASU researcher, to separate fact from fear of robot overlords.

Dr. Heather Roff, a research scientist with ASU’s Global Security Initiative and a senior research fellow at Oxford, is testifying on the unintended risks of such weapons and on where autonomous or semi-autonomous weapons already exist. Her research focuses on artificial intelligence and human control.

Question: Why would we want weapons systems that control themselves? Do we have some already?

Answer: There are real potential advantages for the use of autonomous weapon systems (AWS), including force protection and low cost — in both monetary and human terms. There is also the increased ability to maneuver in “denied” environments, like underwater or in areas with jammed communications. In this way, AWS could be a “force multiplier” — that is, maximizing the effects and impacts of force without having a person to do it. However, the concerns outweigh these potential benefits. Use of AWS could lead to an artificial-intelligence “arms race,” where states attempt to realize stronger and stronger AI to outmatch their opponents. This alone is troublesome, but there are other grave risks too, such as the risk of malfunction or fratricide, the escalation of a conflict, increased levels of deniability, stressors on accountability and the potential for mass surveillance of populations. The potential for mass human-rights abuses overshadows short-lived advantages in a denied warfighting space. 

Q: Are we at risk of a real-life Skynet?

A: The key issue here is whether states want to delegate the decision to kill to a machine. If they do not want to relinquish that authority to a tool, and reserve that only for human beings, then they cannot support AWS. I understand that many people may think of AWS as some sort of precursor to “Skynet,” but I don’t think that is a helpful analogy. If we jump from AWS — something that can engage and fire without human intervention — to Skynet, then we fail to see the issues that are in front of us now: systems that may not be easily testable under combat conditions, are not verifiable, may not be predictable or reliable, systems that would breed incentives for human-rights abuses, or could trigger conflicts due to unintended engagements.  

Q: When people think about autonomous weapon systems, they may think of “drones.” Are they right?

A: AWS are not “drones.” A remotely piloted aircraft — or RPA — is piloted. In other words, there is a human there from the outset.  Not only is a human piloting the aircraft, there are many other humans engaged with the operation of and decision to use lethal force. As to whether “drones” would become autonomous, the answer is “sure.” Autonomy is a capability. With the right software, we can make unmanned systems autonomous. Look to, for example, the U.S. Navy’s swarm boats or Google’s self-driving car. The difference, however, is that those vehicles are not armed and making lethal decisions. I think autonomy will challenge diplomacy because of difficulties with transparency, signaling intent between states, showing resolve and permitting confidence-building measures. The ubiquitous use of these systems could change the face of conflict in ways we don’t yet fully understand.

To schedule an interview with Dr. Roff, please contact Logan Clark at the ASU Office of Media Relations and Strategic Communications at mediarelations@asu.edu.
