Autonomous Weapons Would Take Warfare To A New Domain, Without Humans : All Tech Considered Former special operations soldier Paul Scharre helped create U.S. military guidelines on autonomous weapons. His new book, Army of None, looks at the advances in technology and the questions they raise.

Autonomous Weapons Would Take Warfare To A New Domain, Without Humans


ARI SHAPIRO, HOST:

Killer robots have been a staple of TV and movies for decades from "Westworld" to "The Terminator."

(SOUNDBITE OF FILM, "THE TERMINATOR")

MICHAEL BIEHN: (As Kyle Reese) It doesn't feel pity or remorse or fear. And it absolutely will not stop ever until you are dead.

SHAPIRO: In the real world, killer robots are officially known as autonomous weapons. At the Pentagon, Paul Scharre helped create the U.S. policy for autonomous weapons. And now he has a new book out called "Army Of None: Autonomous Weapons And The Future Of War." And Paul Scharre is our guest on this week's All Tech Considered.

(SOUNDBITE OF ULRICH SCHNAUSS' "NOTHING HAPPENS IN JUNE")

SHAPIRO: Welcome to the program.

PAUL SCHARRE: Thanks. Thanks for having me.

SHAPIRO: I defined an autonomous weapon as a killer robot. Can you give us a better definition?

SCHARRE: Yeah. I probably wouldn't use language quite that sensational...

SHAPIRO: OK.

SCHARRE: ...But it captures - you know, it captures the essence of the idea. We're talking about a weapon that could go out on its own and make its own decisions about who to kill on the battlefield.

SHAPIRO: Do they exist today?

SCHARRE: You know, in some crude forms a little bit. There are at least 30 countries that have autonomous weapons that are supervised by humans for defensive purposes, things that would target incoming missiles and shoot them down entirely on their own. Now, humans are sitting there at the console. They could turn it off if they need to. But in a simple way, those are autonomous weapons.

SHAPIRO: So if people decided they wanted to race towards autonomous weapons as fast as they could, they wouldn't have far to run.

SCHARRE: Well, the technology is taking them there really whether they like it or not. Things like more advanced hobby drones, the same technology that will go into self-driving cars, all of those sensors and intelligence will make autonomous weapons also possible.

SHAPIRO: So this is not a debate over whether we should create these technologies. The technologies are already created.

SCHARRE: Right. The debate really is, what do we do with this? Do we build these? Do we build weaponized versions of them? Do you build them en masse? Do militaries invest in this and take warfare to a whole new domain, a domain of warfare where humans have less control over what happens on the battlefield?

SHAPIRO: And these debates are not only happening in the United States and Western democracies. These debates are happening in autocratic countries, in highly isolated countries, in countries that have violated international norms repeatedly.

SCHARRE: Right. I mean, Russia is building a fleet of armed ground robots for war on the plains of Europe. And Russian generals have talked about a vision in the future of fully roboticized units that are independently conducting operations. So other countries are leaning hard into this technology.

SHAPIRO: So if people listening are starting to get worried, let me just assure them it gets worse.

SCHARRE: (Laughter).

SHAPIRO: You describe a lot of terrifying scenarios. One of them is what you call a flash war, which is sort of like the flash crash that happened in the stock market partially as a result of automated trading. What is a flash war?

SCHARRE: Well, just as we've seen an arms race in speed in stock trading, where trading has now moved to timescales of milliseconds, where humans cannot possibly be engaged and compete, the fear is that we'd see something similar in warfare, where countries automate decisions on the battlefield, taking humans out of the loop because there's an advantage in speed.

But just like we've seen accidents in stock trading where algorithms are interacting in surprising ways and you get things like flash crashes, the worry is that you get an equivalent - a flash war where algorithms interact in some way and the robots start shooting each other and running amok, and then humans are scrambling to put a lid back on it.

SHAPIRO: You also raise the possibility that autonomous weapons could save lives because machines wouldn't make the same mistakes that people make. Explain that.

SCHARRE: Well, that's certainly one of the arguments against a ban, and some people even argue in favor of building these weapons. And I'd compare them to cars. Just like self-driving cars could someday make the roads much safer, some people have argued, well, maybe autonomous weapons could be more precise and more humane, avoiding civilian casualties in war and only killing the enemy.

SHAPIRO: You also served in the U.S. military. You have fought in wars. And you describe instances where you could legally have used lethal force and killed a person, but you understood that that would not have been the right choice in that scenario. Tell us about one of those instances. And I wonder what an autonomous weapon would have done had it been in your shoes.

SCHARRE: There was an incident early in the wars in Afghanistan where we were up on a mountaintop in eastern Afghanistan near the Pakistan border. I was part of a ranger sniper team. And a little girl came along that was scouting out our position. And we watched the girl. She watched us. After a while, she left. And soon after, some Taliban fighters came, and we took care of them. And later we talked about, you know, what would we do if we were in a similar situation? Something that never came up was shooting this girl. No one discussed it. No one - it would have been wrong.

SHAPIRO: Even though the Taliban was using her as a scout and it would have been legal.

SCHARRE: Well, and here's the thing. The laws of war do not set an age for combatants. It's based on your actions. And if you're scouting for the enemy, you're participating in hostilities. So an autonomous weapon that was designed to obey the laws of war would have shot this little girl. So there is an important difference in what is legal and what is right. And that is one of the concerns that people raise about autonomous weapons, is a lack of ability to feel empathy and to engage in mercy in war. And that if we build these weapons, they would take away a powerful restraint in warfare that humans have.

SHAPIRO: So how do we make sure this doesn't happen?

SCHARRE: Well, there are a number of people who've called for an international treaty that would ban autonomous weapons. There have been conversations underway at the United Nations for five years now. But progress is moving very slowly diplomatically. Meanwhile, the technology keeps racing forward.

SHAPIRO: And we've seen Syria violate international treaties. We've seen North Korea violate international treaties. Even if there were an international treaty like this, what guarantee would there be that some country wouldn't see that as an opportunity to get ahead of the pack?

SCHARRE: Well, that is exactly one of the objections against a treaty. These treaties only really constrain countries who care about the laws of war in the first place. And so a treaty that took away powerful weapons from the most law-abiding nations and then only gave them effectively to rogue states would hardly be in anyone's interests.

SHAPIRO: So I ask this only half in jest - are we doomed?

SCHARRE: I mean, I think that's one of the things that the book really wrestles with: Is this inevitable? Do we control our technology, or does our technology control us?

SHAPIRO: Well, that does kind of dodge the question. Are we doomed?

(LAUGHTER)

SCHARRE: You know, one of the things I walk through at the end of the book is, what are some options going forward? I think there are ways to think about narrower regulations that might be more feasible and avert some of the most harmful consequences - maybe a more narrow ban on weapons that target people. And there have been some discussions underway internationally trying to reframe the issue and think about, what is the role of humans in warfare?

So if we had all the technology in the world, what role would we want people to play in war and why? I think that's a valuable conversation to have. And, you know, we do have the opportunity to shape how we use technology. We're not at the mercy of it. The problem at the end of the day isn't the technology. It's getting humans to cooperate together on how we use the technology and make sure that we're using it for good and not for harm.

SHAPIRO: Paul Scharre's new book is called "Army Of None: Autonomous Weapons And The Future Of War." Thank you for joining us.

SCHARRE: Thank you. Thanks for having me.

Copyright © 2018 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.