Once the stuff of science fiction, robotics is already changing the way wars are fought, says P.W. Singer. How will it affect the politics, economics, laws, and ethics of warfare?
JOANNE MYERS: Good afternoon. I'm Joanne Myers, Director of Public Affairs Programs, and on behalf of the Carnegie Council I'd like to welcome our members and guests and to thank you for joining us.
Today I'm delighted to welcome back P.W. Singer to the Carnegie Council.
Dr. Singer is a person who likes to tackle provocative topics, and does so in a very engaging and instructive manner. Child soldiers and military outsourcing were the subjects of his previous award-winning books, both of which were discussed earlier at the Carnegie Council. The transcripts of his talks on Corporate Warriors and Children at War can be found by visiting our website at http://www.carnegiecouncil.org/index.html.
Today he will be discussing his latest publication, which I know you will find just as fascinating as his previous ones. Wired for War: The Robotics Revolution and Conflict in the 21st Century is a very exciting read and has already been featured in the video game Metal Gear Solid 4: Guns of the Patriots.
Wars are times of innovation in weaponry. Stealth drones, GPS-guided smart munitions, self-guiding antitank weapons, and robots used for tasks such as reconnaissance, bomb disposal, handling of hazardous materials, guard duty, medical evacuation, and combat are the newest arms being employed to go places too dangerous for soldiers to venture.
No, I am not describing The Terminator, Blade Runner, or Star Wars-style science fiction, but the grim reality of warfare today. On the battlefields of Iraq and Afghanistan robots are killing America's enemies and saving American lives. For the first time in history, the word "warrior" is being separated from the word "person."
But what happens when science fiction starts to play out on modern-day battlefields? What happens when robots take the place of soldiers, and military tasks previously assigned to humans are handled by machines?
What are the economic, legal, and ethical implications? For example, is system malfunction a justifiable defense for a robotic fighter plane that contravenes the Geneva Conventions and mistakenly fires on innocent civilians?
Our guest is a leading expert on 21st-century warfare and the rise of new actors in conflict. In Wired for War he examines robotic warfare by tracing the early development of robotics and their use on the battlefield. He lays out where we are now and the consequences for the future. Some of what he has to tell you may be familiar, but most of it you will find surprising, entertaining, and very thought-provoking.
Please join me in welcoming a person who is always well ahead of the curve, our guest this afternoon, who came from Washington just to be with us, P.W. Singer.
Thank you.
P.W. SINGER: I appreciate that kind welcome.
I thought what I'd do is begin with a scene of war:
"There was little to warn of the danger ahead. Hidden alongside the road, the IED (improvised explosive device) looked like any other piece of trash. By 2006, there were more than 2,500 of these attacks every single month, and they were the leading casualties among both American troops in Iraq and Iraqi civilians. The team that was hunting for this IED is called an EOD (explosive ordinance disposal) team. They are the pointy end of the spear in the effort to tamp down on these roadside bombs.
"Each EOD team goes out on approximately two bomb calls every single day, every day defusing two bombs in Iraq. Perhaps the best indicator of their value to the war effort is the fact that the insurgents put a $50,000 bounty on the head of an EOD soldier.
"Unfortunately, this particular call wouldn't end well. By the time the soldier had advanced close enough to see the telltale wires on the bomb, it was too late. It exploded in a wave of flame.
"Depending on how much explosive has been packed into an IED, you have to be as far away as 50 yards to escape death or injury from the fragments. The blast is so strong that it can even break your limbs without even being hit by fragments. And, unfortunately, this soldier had been right on top of the bomb. So when the rest of the team advanced, they found little left.
"That night the unit's commander did his sad duty and he sat down to write a letter back to the United States about the incident. In his condolence, the chief noted the soldier's bravery and sacrifice and he talked about how hard the rest of the unit had taken that loss, that the soldier had been their bravest, had saved their lives on numerous occasions, and he apologized for his inability to bring him home.
"But then he talked about the silver lining that he took from the incident, and this is what he had to say: 'At least when a robot dies you don't have to write a letter to its mother.'"
Now, that scene may sound like science fiction, but it was actually battlefield reality, as you are soon going to see here. So something big is going on in war today, and maybe even in the overall history of humanity itself.
The U.S. military went into Iraq with a handful of drones in the air; we now have 5,300 in the inventory. We went in with zero unmanned systems on the ground; we now have 12,000.
We have to remember these are the Model T Fords, the Wright Flyers, compared to what's coming. This is the first generation. And yes, the term "killer application" is not just something that applies to what iPods have done to the music industry. These systems are being armed with everything from machine guns to rockets and missiles; you name it.
That's where we're at right now, those numbers that I just mentioned.
One Air Force three-star general that I met with said that it's not unreasonable to postulate the next conflict involving tens of thousands of robots. And let me be clear here, we're not saying tens of thousands of today's robots, like those that you're seeing here, but tens of thousands of tomorrow's robots, because one of the things you have in technology is Moore's Law, the fact that you can pack more and more computing power into each microchip, basically that they double in their computing power every two years. So what that means is that within 25 years these systems will be a billion times more powerful in their computing. And I don't mean a billion like the sort of amorphous Austin Powers' "one billion." I mean literally take the power of your iPhone and multiply it times a billion.
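To make that compounding concrete, here is a minimal back-of-the-envelope sketch in Python (my own illustration, not from the talk; the talk does not specify which doubling rate yields the "billion times" figure). Steady doubling multiplies capability by 2^(years / doubling period), so the outcome is extremely sensitive to the assumed period:

```python
# Back-of-the-envelope compounding behind "doubling every two years".
# The doubling periods below are illustrative assumptions.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Multiplier after `years` of steady doubling: 2 ** (years / period)."""
    return 2.0 ** (years / doubling_period_years)

# Classic Moore's Law pace: one doubling every two years.
print(f"{growth_factor(25, 2.0):,.0f}x after 25 years")      # ~5,793x
# A billion-fold gain is about 30 doublings, i.e. one every ~10 months.
print(f"{growth_factor(25, 25 / 30):,.0f}x after 25 years")  # ~1,073,741,824x
```

At the classic two-year pace, 25 years yields a multiplier in the thousands; a billion-fold gain implies roughly one doubling every ten months, the faster aggregate pace sometimes projected when chips, sensors, and software improve together.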
What that means is that the kind of things that we used to only talk about at science fiction conventions like Comic-Con have to be talked about in the halls of power and in places like the Pentagon. A robots revolution is upon us.
Now, I need to be clear here. When I say "a robots revolution," I don't mean you have to worry about the Governor of California showing up at your door. It's not that kind of revolution. It's a social science revolution. It's a revolution in history.
It's akin to, for example, the advent of the Atomic Age. But it may be even bigger, because our new unmanned systems don't just affect the how of war fighting; they affect the who of war fighting at the most fundamental level. That is, every previous revolution in war, like the machine gun or the atomic bomb, was about a weapon that could go farther, shoot faster, or make a bigger boom. That's happening with robots too.
But it also is affecting the very experience of warriors themselves, and even the very identity of warriors themselves. When I say "who," I mean it this way: humankind's 5,000-year-old monopoly on the fighting of wars is breaking down in our lifetime.
Now, I thought that was kind of a really big deal, and so I set out to basically capture the stories of everyone who was involved in this, meeting with robot scientists; the science fiction authors who inspired them; 19-year-old drone pilots who are fighting wars in Iraq but never leaving Nevada; the four-star generals who command them; the politicians who send them out to fight; the Red Cross and Human Rights Watch activists who are trying to figure out the laws and ethics of this new space; the Iraqi insurgents and what they think about these systems; news editors and journalists in Pakistan and Lebanon, basically trying to get that full 360-degree view of what's going on and capture those stories.
But the interesting thing is not only are the stories fascinating, but I think they shine a light on some of the broader trends that are starting to ripple out in terms of the effects on society, on economics, on law, on ethics—you name it. So what I'd like to do is basically flesh a few of those out for you.
The first is the fact that the future of war, even a robotic one, is not purely an American one. There is no such thing as a permanent first-mover advantage in technology. I mean how many people here still use Wang computers? It's the same thing in war. It was the British and the French who first used the tank; it was the Germans who figured out how to use it better. And so what we have to worry about is that, although we are ahead of the game right now in military robotics, we're not the only player. In fact, 43 other countries are working on military robotics right now, including countries like Russia, China, Pakistan, and Iran.
And we have to worry about where the state of American manufacturing today and the American education system, particularly when it comes to science and mathematics, will take us in this revolution. Another way of putting it is: What does it mean to use soldiers whose hardware is increasingly made in China and whose software is written in India?
Just like software has gone open source, so has warfare. Unlike an aircraft carrier or the atomic bomb, these are not systems that require a huge industrial complex to make them. A lot of them use commercial technology, or even what they call do-it-yourself. For about $1,000 you can build a drone that is much like that Raven drone you saw the soldier toss into the air and fly. For $1,000 you can build one of those at home. That means that there are some interesting things that ripple out of that.
For example, one of the things that the book captures is the story of a group of college students who fund-raised to do something about Darfur, and a private military company (PMC) approached them and offered to rent them a drone. They were still in college, carrying out their communications with the PMC from their dorm room.
But there's a darker side to this, of course: people whom we might not want to have these systems may get them. So, for example, during the war between Israel and Hezbollah, a war between a state and a nonstate actor, Israel flew drones; but so did Hezbollah, which flew four different drones back at Israel. Radical websites have offered the ability to remotely detonate an IED in Iraq while sitting at your home computer. We have even had one of our robots captured, retooled, and sent back at us as a mobile IED.
I think we should see the combination of robotics and terrorism affecting the future of conflict in two trends. One is that it reinforces the empowerment of individuals versus states, continuing something already in play. The second is that it widens the game of terrorism. That is, it brings new players in, either because of capabilities or because they're so upset about the change that's surrounding them that they lash out. You know, we had the Luddites 200 years ago. We have concerns today about the neo-Luddites.
You have the deep ethical questions that robotics raises about human identity and human law. Remember, people are willing to blow up abortion clinics and use sniper rifles against doctors. Is it so much to think that they might be willing to do something here?
So the way people lay this out is that the future may effectively be "al Qaeda 2.0 meets the next-generation version of the Unabomber." And you have to remember, you don't have to promise a robot that it is going to be greeted by 72 virgins to convince it to blow itself up.
Now, the ripple effects of this go outward into other areas, such as our politics. One of the people I met with was a former assistant secretary of defense for Ronald Reagan. He put it this way: "I like these systems because they save lives. But I also worry about more marketization of wars, more 'shock and awe' talk to defray discussion of the costs. People are more likely to support the use of force if they view it as costless."
I took that and I thought, You know what we have going on here? There are certain trends already active in our body politic that may be taken to their final, logical ending place via robotics.
We don't have a draft right now. We don't have declarations of war anymore. We don't buy war bonds anymore, or pay higher taxes because of war. And now we have a trend that takes more and more of those Americans who would be going into harm's way, and the families that care about them, and instead we're replacing them with machines. So what that may be doing is taking the already-lowering bars to war and dropping them to the ground.
But the future war is also going to be a YouTube war. That is, our technologies don't merely remove the human from risk; they record everything that they see. So they don't just delink the public; they reshape its relationship with war.
There are several thousand video clips already up online of combat footage from Iraq or Afghanistan, much of it gathered via drones. Now, this ability to watch war on your iPhone or on your home computer could arguably be a good thing. It could be creating new connections between the home front and the war front, able to see what's happening in war as never before. But remember, it's playing out in our real world, our strange real world, and so it's becoming a form of entertainment for some people. Soldiers call this "war porn."
A typical example that I was sent was an email with a video clip attached. The clip was of a predator drone taking out an enemy site in Iraq. The missile hits, there's an explosion, and some of the bodies are tossed into the air. The video clip was set to music, the song "I Just Want to Fly" by Sugar Ray, a pop hit from a few years ago.
So we have coming—or actually, we already have right now—the ability to see more but experience less. And we need to remember that not everyone is fighting from afar, and the violence is still real.
The sports parallel this brings to mind is the difference between watching a professional basketball game on TV, where the players are tiny little figures on the screen, versus seeing a basketball game in the arena, where you really do see what someone who's seven feet tall really does look like. It's fundamentally different. As is actually playing that game yourself.
But we need to remember these are just the clips. So it's not even like watching the full game; it's like watching the highlight reel, the ESPN SportsCenter version of the war. All the context, all the strategy, all the training just becomes slam dunks and smart bombs.
The irony of this is that while the future of war may involve more and more machines, it's still being driven by our human failings, and it's human psychology that's shaping all these ripple effects outwards.
A real policy example of that, that we have right now facing us, is robotics' impact on our very human war of ideas that we're fighting against radical groups. That is, what is the message that we think we are sending with these systems, versus what is the message that is actually being received?
One of the people that I met with on this was a very senior Bush administration official, who believed that our unmanning of war "plays to our strength. The thing that scares people is our technology."
But when you speak with people in, for example, Lebanon, there's a different perception. One of the people that I spoke with there was an editor of one of their leading newspapers. As we were talking, a drone was flying above him. This is what he had to say: "It's just another sign of the cold-hearted, cool Israelis and Americans, who are also cowards, because they send out machines to fight us. They don't want to fight us like real men, because they're afraid to fight. So we just have to kill a few of their soldiers to defeat them." There is a big difference between the message we think we're sending versus what the other side is receiving.
Or, as one Pentagon analyst put it: "The optics of this look really freaking bad. It makes us look like the Evil Empire from Star Wars and the other side look like the Rebel Alliance."
But the future of war is also featuring a new type of warrior with a new experience of war itself, what I call a "cubicle warrior."
And it's redefining the very meaning of the term "going to war." My grandfather went to war in World War II in the Pacific. That meant he went to a place of danger, and that meant he went to a place from which the family didn't know whether he was ever going to come back again.
Now, compare that with the experience of a predator drone pilot talking about his wartime experience fighting in the Iraq War and the War in Afghanistan while never leaving Nevada: "You're going to war for 12 hours, shooting weapons at targets, directing kills on enemy combatants. And then you get in the car and you drive home, and within 20 minutes you're sitting at the dinner table talking to your kid about their homework."
It's not easy, and the psychological disconnects of this are very tough on these new warriors. In fact, they have higher rates of PTSD (post-traumatic stress disorder) than many of the units physically in Iraq.
But these trends of disconnection also have us concerned about something beyond that. This is how one young pilot described the experience of taking out enemy troops from afar: "It's like a video game."
But as anyone who has played Grand Theft Auto knows, or as anyone whose kids play those video games knows, there are things we do in the virtual world that we wouldn't do in reality or face-to-face. So the lines in between can get very messy.
So much of what you're hearing here is that there is always another side to technologic revolutions. Moore's Law is operative, but so is Murphy's Law. That is, the age-old fog of war isn't being lifted. The enemy still has a vote. And, most importantly, we're experiencing and learning about new human dilemmas. So we're getting great capabilities, but also new questions for us to ponder.
Sometimes these dilemmas that pop up are just easily fixable "oops moments." That's how one robotics company executive described it: "When things don't go right with robots, it's just an oops moment."
What is an oops moment when you're talking about robotics and war? Well, sometimes they're kind of funny, like one that pretty much recreated a scene from that old Dudley Moore/Eddie Murphy movie Best Defense, which you probably don't remember. In reality, not in the movie, a machine-gun-armed robot during a demonstration ended up pointing its gun at the reviewing stand of VIPs. Fortunately, there were no bullets in the gun. But it was an oops moment.
Other times they're tragic. There was a real-world recreation of the famous scene from the movie RoboCop, where, just last year, an antiaircraft cannon in South Africa had a "software glitch" and ended up firing in a circle. Nine soldiers were killed.
Or you have questions of what I call unmanned slaughter, when someone you didn't intend to kill is killed. For example, the three times we thought we got bin Laden with predator drone strikes and it turned out not to be the case.
That's where we're at right now. And that's without armed autonomous systems, the kind of robots that we think about in science fiction. Sometimes people will say, "No, no, that will never happen." I joke that it's kind of like the Lord Voldemort of the issue, the issue "that shall not be discussed," as in the Harry Potter books.
The reality is this: I came across four different Pentagon research projects on various aspects of armed autonomous systems. It is coming.
So it raises questions like: Will you have more or fewer war crimes? On one hand, robots are emotionless, so they don't get angry when their buddy is killed and they don't commit crimes of rage or revenge, which is how a lot of war crimes happen. But, on the other hand, robots are emotionless. To a robot, an 80-year-old grandmother in her wheelchair is just the same as a T-80 tank; they're both just a series of zeroes and ones.
And so what we have to figure out is: How do we catch up our 20th-century laws of war, which are so old right now that they qualify for Medicare, with 21st-century technologies?
That's one of the fascinating things that came out of the meetings I did at places like the International Red Cross or Human Rights Watch, where they're equally flustered on what to do with these systems. In fact, there's a scene in the book at Human Rights Watch where two of the people there actually get into an argument on what should apply and what shouldn't, and they start referencing not the Geneva Conventions but whether the Star Trek Prime Directive applies. It's kind of funny, but it also points to how much at a loss we are with where to take all this.
So this is the ending point for me. It sounds like I've talked about the future of war, but notice that you've only seen pictures of systems that already exist today [Singer's slides were running during this talk]. Notice that almost every one of the examples I gave you was something that has already happened. So it sets a real challenge for all of us, well before we get to the point of having to worry about your Roomba robot vacuum cleaner sucking the life out of you at night or something like that.
It's this: Are we going to allow the fact that so much of this sounds like science fiction to keep us in denial about the fact that it is battlefield reality? Are we going to make the very same mistake that a previous generation did with another technology that sounded like science fiction, atomic bombs? Remember, atomic weapons were first conceived in H.G. Wells' novel The World Set Free; the very term "atomic weapon" comes from that story, and so does the concept of a chain reaction that inspired the researchers in the Manhattan Project.
So again, are we going to make the same mistake that a previous generation did in thinking that something was science fiction and waiting to deal with the consequences of it, and all the ethical and social and political issues that they raise, until after Pandora's box has been opened up?
Now, I could be completely wrong. One scientist that I met with, who is working on robotics for the Pentagon, said: "You're wrong. There's no real ethical or legal dimensions of robotics that we have to worry about"—"that is," he added, "unless the machine kills the wrong people repeatedly. Then it's just a product recall issue."
So there's a lot to say here. But the logical ending point is this, and it's actually a jump into the realm of science fiction. A few years ago, the American Film Institute gathered two lists covering the past century of Hollywood: the top 100 heroes and the top 100 villains, the characters that represented the best and the worst of humanity. Only one character in all of Hollywood history made it onto both lists: The Terminator, a robot killing machine.
The point here is this: It's not just that our technology can be used for both good and ill, but that it also reveals a certain duality to the humans behind it. That is, it's our creativity that has distinguished us from every other species. It's our creativity that has taken us to the stars. It's our creativity that has developed works of art and literature to express our love for each other.
And now we are using our creativity to do something remarkable, to create this incredible technology, to create what may one day, some argue, be an entirely new species. But the reason that we're doing that is because of our inability to get beyond destroying each other.
And so the question we have to raise is this: Is it our machines that are wired for war or is it us?
Thank you.
Questions and Answers

QUESTION: There are countless international nonprofit organizations trying to stop or prevent conflict. Do you think they should use this kind of technology to do so?
P.W. SINGER: Maybe where I can take this is to a question around the ethics of robotics. When people talk about it, they usually just focus on the ethics of the machines themselves; that's the whole thing with Asimov's Three Laws of Robotics.
The first problem with that is Asimov wrote in English; software is written in numbers, and you can't simply drop it in, as the toy sketch below illustrates.
The second is you look at Asimov's Laws, and it's things like "robots shall not harm a human." Well, you're building these systems to do so.
Or you have the idea that a robot should take orders from any human. Well, do you want a robot that Zawahiri can walk up to and say, "Robot, go shut down"? There are also evil humans out there whom you wouldn't want to have control over these systems.
Or you have the question of who gets control of the information within them?
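To make that first problem concrete, here is a deliberately naive sketch of what "a robot shall not harm a human" might look like when forced into code. Everything in it (the class, fields, and threshold) is a hypothetical illustration, not drawn from any real system:

```python
# A deliberately naive encoding of Asimov's First Law, showing why the
# English sentence cannot simply be "dropped in." All names, fields, and
# thresholds here are hypothetical illustration.

from dataclasses import dataclass

@dataclass
class Entity:
    is_human: bool          # Who decides this classification, and how reliably?
    estimated_harm: float   # "Harm" on what scale? Physical only? Psychological?

def first_law_permits(targets: list[Entity]) -> bool:
    """Forbid any action whose targets include a human at risk of harm.

    Every term below hides an unstated judgment the English sentence
    glosses over: what counts as harm, how certain the classifier must
    be, and whether harm through inaction (Asimov's own clause) counts.
    """
    return all(not (e.is_human and e.estimated_harm > 0.0) for e in targets)

# To the rule, a grandmother in a wheelchair and a T-80 tank are both
# just feature vectors; the moral distinction lives in the (missing)
# classifier, not in the law itself.
print(first_law_permits([Entity(is_human=False, estimated_harm=1.0)]))  # True
```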
But the ethical question for me that people aren't asking is the ethics of the people behind the machines. That is, things like: What should be built? What should not be built? Who should have the authority to not only control them, but to utilize them? Is this something that we want limited just to militaries, or are we okay with it being used by police forces?
The L.A. Police Department is actually purchasing a drone to park over a high-crime area. In some ways it makes perfect sense. But in other ways you say, "Well, that actually sounds like there are rights concerns in it." So there are all these sort of ethical questions that we should be wrestling with, and I don't see us doing so yet.
That, of course, raises the final set of questions, around arms control. My fear is that, despite all the various organizations you mentioned (there is literally a cottage industry of NGOs), the sheer number of them in some ways undermines their very goal, because they become competitive with each other.
But the point is, we know humanity tends to wait until the bad thing happens before it kicks into gear. You don't get international law until you have the Thirty Years' War. You don't get the Geneva Conventions until you have the Holocaust. You don't get the Mine Ban Treaty until you have 100 years of landmines and literally millions of them sown under the earth. That's my concern with these systems.
QUESTION: I have a question about the ethics of the people who create the robots. How is our concept of war changing as a result of having these robots? When you were giving the numbers, it seemed as if we're simply going to have many more of them, as if we're just replacing humans with robots instead of rethinking what the concept of war should actually be. So I was curious about where you ended up on that concept.
P.W. SINGER: Thanks. I didn't plant her in the crowd, I promise. It actually points to something, a point I don't make in the presentation.
When you work in public policy (I work at a think tank) and you say, "I want to write a book about robots and war," that's a career risk. Not a single foundation wanted anything to do with it. That has actually, interestingly, been the case with all my other books. When I set out to write about this crazy thing of private military contractors, no one wanted anything to do with it, because it sounded like Hollywood. The same with the work on child soldiers. The same with unmanned systems.
So I decided if I was taking a risk I might as well double down. The way it is written is also a little bit of an insurgency against my own field, which is that we tend to take issues that are incredibly important and make them dry and inaccessible. And so the book is written with kind of my own voice. It's the way I would talk with my friends. And so if you're going to illustrate something, you can reference the history, but you are also likely to say, "It's just like in that movie so-and-so."
And even, weirdly enough for a book on robotics and war, it actually has a lot of humor packed into it, a lot of jokes. There's actually a contest to see if you can guess the number of hidden pop culture references in it. So it's trying to take a very different approach to how my field writes about important issues.
The question on the mimicking is a really interesting one. We're just now starting to think about how we design these systems. People started out being very utilitarian (the system was designed to do the task), but you're starting to see interest in things like whether you can make a system that scares the other side. One of the people I interviewed talked about whether there is a way we can take advantage of humans' natural fear of insects.
Then there's the question of this thing called the "uncanny valley." Studies in robotics show there are basically two ways that people get scared of robots.
One is that it looks scary and freaky, like the insect ones. Remember, in war people have always done this. The Redcoats wore red and those tall hats to create the image of a line of monsters coming at you, so that even when you shot at them it looked like you hadn't even hit them. Or during World War II, the Stuka bombers had sirens on them, so people would get scared before the bomb even dropped. People are saying, "Why not do that with robotics, make them visually scary or pack them with scary sounds?" That's all getting into the designs.
But the other side of the "uncanny valley" is that researchers found there is a point, when a robot looks close to human but not fully human, at which it is the most freaky, and some natural thing in us causes us to recoil. It's kind of like the fear of a zombie. It's very interesting how that plays out in terms of age and culture and the like. It's something that kind of spooks me.
But the other part of this perceptual game that you laid out is that we want to make it sound like "Oh, this is the big bad Pentagon that's doing all this and changing our perceptions of war." But, interestingly enough, there was an op-ed in The Washington Post just a couple of weeks ago that argued: Well, since we don't want to do something about Darfur ourselves, why don't we have unmanned systems do it for us? This was in The Washington Post.
Now, pull back from the sort of weird irony of a humanitarian intervention by inhuman machines, and just deal with the fact that it points to this aspect that we all tend to forget, that military operations are not just one-off deals. They involve you in a conflict and the consequences of it on the ground, in all the various complex social and cultural contours. So Darfur is not something that you just pop a predator drone on top of and you've fixed it. It's a complex place, and if you get involved you are involving yourself in it and you have a responsibility.
There's also the fact, of course, that operations don't always go the way you planned them.
But then, the final part of it for me is this: When you unman your humanitarian intervention, you show that you care, but you also show that you don't care enough.
QUESTION: It has been reported that the 5,300 drones in Afghanistan have been successful against al Qaeda. But then again, in Afghanistan, the war effort doesn't seem to be successful. So I was wondering, to what degree is the experience in Afghanistan an illustration of the capabilities and of the limitations of robotic warfare?
P.W. SINGER: It's a really great question. It points to something that's not just limited to Afghanistan. It points to our experience in the strikes over the border into Pakistan, where I think the exact number is 38 official cross-border strikes in the last five months, and also, for example, to the Israeli experience in Gaza most recently. These systems offer incredible capabilities and allow you to do things that you wouldn't be able to do otherwise.
That's not only in being able to hit enemy targets without having to put your troops into harm's way. Look at the situation in Pakistan. You could not get American troops in there without a good number of them dying, and you probably would not be able to accomplish your goals.
But beyond just the air strikes, they allow you to get things like the pattern of life in a community. It's described as sort of like a cop walking the beat. This predator drone can stay above a village for days on end. And so the operators of them, even though they are in Nevada, can figure out patterns of life; they can figure out: "That pickup truck wasn't there yesterday, and why don't we backtrack to see where that pickup truck came from?"—the same way a cop walking the beat can figure out things.
It can document things. The book describes how a predator drone spots a group of insurgents shooting off mortars from a pickup truck; they then get in the truck and drive away, so they're nowhere near their mortars. Five miles away, a U.S. patrol picks up those guys. They say, "You've got the wrong men. We're not armed. You can't stop us." And then, during the interrogation, the interrogators show them the video, and so they confess. It documents things.
So you get these great capabilities. But it's just like what Israel has experienced: Great, you can take out the senior leadership, and do it in a very precise manner that limits civilian casualties. But what about the 12-year-old? What is he thinking? And have you made it such that he is not going to get into the game? That's the short-and-long-term balancing, and no one has a good answer to it.
QUESTION: I'm trying to imagine how you envision, or whether you envision, how these wars are going to play out. Are they going to send drones to any country they want, send them here, and we have to have drones ready all the time in case they come in by ship or plane? It seems to me we could get into an incredible world war if there are 40 countries, as you say, that are already involved in this kind of technology. How do you envision these wars taking place?
P.W. SINGER: The first thing is that I'm really loath to go into long-term projections, because we know people tend to do them really badly. Actually, government does it the worst; science fiction does it the best. Take the H.G. Wells example that I gave (Wells also predicted superhighways before there were automobiles), versus the fact that the government prosecuted the founder of RCA (Radio Corporation of America) for the crazy concept of trying to sell the idea that one day you would be able to talk on a radio across the Atlantic. They said, "That's such a crazy idea that you must be working a swindle."
I don't like to get into that big, broad projection. But I do think the evidence shows we're headed in the direction of a mix, a mélange. For our own systems, certain military roles are being replaced by robotics and other roles aren't. You'll probably continue to see more of these mixed human/robotic teams, be it on the ground or in the air; they're called "warfighter's associates."
Think about this in a certain way. We don't have elevator operators anymore. We originally said, "Oh no, you'll always have to have a human in there to do it for you"—we don't have that anymore. But we still have human toll collectors, which is completely unnecessary. I think the same sort of thing is playing out in warfare in terms of individual roles.
You are going to see people using robotics in war more and more. But you also will see people still doing suicide bombing, because the reason someone blows himself up isn't just a tactical, logical thing. There are also ethical and religious drives behind it, the value that the person thinks he is getting from it. So I think we're going to see that weird mix continue.
QUESTION: You spoke about the ability eventually, or perhaps even now, through some kind of video, to watch this warfare, and it becomes a juxtaposition or an interchange between reality and entertainment.
Just a historical note before I make my point. This has already happened. During the Civil War, there were groups of people who would pack up picnic baskets and ride in their carriages or on their horses to the hillside and watch the Confederates and the Union fighting on the battlefield, and at the end of the day they would go home after this picnic. So there you already have it.
But it was a very small group of humanity that was able to view that, so it didn't really have an impact on them the way what you describe will have. I mean there could be horrifying consequences, because it has already been proven that the more human beings witness violence, the more they become inured to it.
It started, you may remember, many years ago, with a terrible famine in Ethiopia that for the first time was filmed: NBC cameras went in. Americans saw the bloated bellies, the millions of children and human beings who were starving. They were so horrified that Save the Children collected millions and millions of dollars, because of the empathy the public had. But the more people began to witness these tragedies, the more inured to them they became. They would change the station, saying, "Oh, it was the same last night."
This could really have a horrifying effect upon the psychology of mass humanity—less compassion, less empathy—and really could change the world the way we know it today. I think that's one of the unintended consequences that's the most terrifying about this. I don't know how you react to that, but I'd like to hear what you have to say.
P.W. SINGER: I think we're in concurrence on some parts of it.
First, this isn't a "what if." Those videos exist. It's not, "Could you one day watch combat?" You saw them projected on that wall. Again, you can go on YouTube right now and watch a lot of it.
I'm going to make another sports parallel to illustrate your point. I used the NBA professional game for the ability to download and watch. But what if you're in a situation where the game is happening and no one cares to watch it? War is not the NBA then; war becomes the WNBA. That scares the heck out of me, because this is the most visceral thing; this is the most active thing.
These are the kinds of decisions that are so important. You can't look at them as something to be disengaged from, to treat decisions surrounding war like a form of entertainment where you just change the channel or, even worse, to treat the decision to go to war or not just like "should we vote for the stimulus package or not; do I care about the bridge tolls or not?" That is very much a concern.
Another thing you talked about is the parallel with the Civil War. One of the things here is how this affects leadership in war itself. From the era of Alexander the Great until roughly Gustavus Adolphus, the best generals fought on the front lines. They were as physically talented as their soldiers. Alexander the Great was known as the best soldier.
That changes with the rise of gunpowder and new communications technology. With each communications technology, the generals get farther and farther away from the battlefield. First, in the Civil War, it's to a slight distance; in Crimea, they're commanding by telegraph. So that has been happening.
Now, these systems bring the general back onto the battlefield, but virtually. That is, they can see like they couldn't before and make decisions in a tactical space like they couldn't before.
There's a scene in the book that illustrates this, where a four-star general talks about spending two hours watching predator drone footage of an individual compound until he decides whether or not to strike that compound, and then he goes in and says, "I decided what size bomb to drop"—a four-star general. That's the job of a captain. But because he could, he did.
And there's actually an ethical dimension to it. He basically was saying, "I took personal responsibility for what happened in this operation. Who knows the commander's intent better than the commander himself, so why shouldn't I make this decision?"
Now, from a military perspective you can say: "That's micromanagement. Great, you can do the job of a captain, but the captain can't do the job of a four-star general. During those two hours you were watching video, the captain couldn't go make the strategic decisions for you."
But there's a political side to this as well, which is that it's not just military folks who can be involved in that watching and doing. The Secretary of the Air Force put it this way to me: He worried about this being a trend that takes LBJ down to the foxhole, that politicians won't be able to resist getting involved. And not just getting involved at the big level of "should we bomb Vietnam or not," but at the very tactical level of "hit this guy, shoot that guy, or not."
QUESTION: We've been circling around this notion of war as a spectacle, the removal of the body of the soldier from being at risk, or of the general as you go up the hierarchy, to these remote sites in Nevada or Arizona in these trailers.
Here's the rhetorical question: Isn't the roboticization of an armed force reducing the friction that we have in a democratic society against going to war? It makes the calculus less risky for politicians because it's decorporealizing the armed forces. We're no longer at risk of wounding our bodies or being ruptured by being on-site; rather, the decisions are being made on-screen. It's the rise of the society of the spectacle.
P.W. SINGER: That's a really good question. You see this play out in two ways.
What was fascinating is that, based on interviews with tons of different people, this idea that it would make us more likely to go to war came up across the board: not only scientists said it, but special operations forces captains said it, and assistant secretaries of defense said it. It was something that almost everyone brought up.
There's a great scene in the book with a human rights activist and a special ops guy. The human rights activist is working on shutting down Guantanamo. The special ops guy is working on putting people in Guantanamo. They were both in complete agreement on that point you made.
So it works in, one, lowering the cost, and that makes it easier to go to war. And then, as one former general put it, it makes it also easier to be in "stay the course" mode, because the public doesn't get as upset, and so mistakes are harder to change out of.
But there's a second thing, which is kind of what that Reagan assistant secretary of defense said. There's the marketization effect, which is very seductive, and it's easy for leaders to fall prey to the possibilities of technology because it is so great.
Let me be clear. Again, it's not just leaders. In this Washington Post op-ed, they're saying, "We ought to do something about Darfur." They're not saying, "I want to go out and do bad things and seize countries for their oil," or anything like that. They're saying, "I want to do good things." But they're seduced by the possibility of the technology. It's hard for them to weigh the consequences of it.
And then, you add in the very interesting way defense industry technology works, which is that we often have to oversell something to get it bought. And so the capabilities often turn out to be less than what was promised.
JOANNE MYERS: I want to thank you very much for being here. It was really terrific.