Military Robots and War Ethics

With the increasing number and autonomy of robots in today's military conflicts, I think it's time for Christians to start asking the ethical war questions:

  • What, for instance, are the implications for Just War doctrine?
  • Can a robot make a moral decision? Can it judge proportionality?
  • If not, how can increasing autonomy be ethically justified?

And then there are the nuts and bolts questions:

  • Can a robot kill humans in self-defence, or must a human life be at risk?
  • Can a robot exercise autonomy where collateral damage risks exist?

I hear echoes of Asimov.

Just War doctrine rests on a presumption that combatants are human. With the robotisation of war now upon us, that presumption must be abandoned. So what now? A new ethical minefield is opening up.

12 thoughts on “Military Robots and War Ethics”

  1. I still think that what is far more important than the ethics of military robots is the ethics of those who build and deploy them. Robots are just weapons, after all. I suppose they must have had the same argument over gunpowder and killing at a distance.


  2. The difference between robots and gunpowder is that if I’m close enough to shoot a gun at someone, then they’re probably close enough to shoot a gun at me, too. So I’m still putting myself in some level of danger.
    However, if I had one, I could send a robot to kill a person without ever actually putting myself in danger. I personally find that ethically troublesome on many levels.
    I’m also reminded of what I think was a Star Trek (Next Generation, maybe?) episode where they visited a planet that was going through a “computerized” civil war. Rather than gearing up with actual weapons, both factions had long ago agreed to let a computer simulate the various battles of the war. The computer would report the results of each battle and the “casualties” on both sides would then report for a lethal injection. The society decided that this was somehow more humane than actually riddling each other with bullets or blowing each other up. The whole thing was ghastly in that clinical sort of way. (Of course, the Enterprise became involved because members of its away team were in a location that was “bombed” during one of these simulated battles and they were expected to report for lethal injection.)
    Robots in war tend to make me think of the same kind of clinical, inhuman approach to warfare.


  3. Steve, well, what I am suggesting is that we look at the developments holistically. I am reminded of Einstein, who worked on the development of the atomic bomb, only to later repent and become an anti-nuclear advocate. It was not just the military pilots who had an ethical responsibility, it was not just the military planners, it was also the military researchers. Einstein belatedly recognised this. If we do not address the ethics in the research phase, we will be repeating his mistake. And it would appear the window is a matter of years, not decades.
    Jarred, in some ways I think this parallels the debates over land mines, which, incidentally, are being phased out by the US military because they are so ethically problematic. And one of the important ethical questions that crops up is, can a robot accept a surrender? For a war to be just in terms of the just war tradition, the enemy must be given the option of surrender. But are robots sophisticated enough to discern surrendering behaviour without human oversight? The discussion papers I have read would suggest not. In fact, even worse, they suggest that robots are not sophisticated enough to even distinguish between combatants and non-combatants! Can robot autonomy then be justified? Even combatants must be granted non-lethal options. This question reminds me of a scene in Robocop, where a military robot malfunctions and guns down a business executive during a live demonstration in a boardroom, while he has his hands held high in a surrendering pose. It’s a horrible scene. How long before we see a real-life equivalent on YouTube?


  4. Military robots would basically be another step in technology doing the messy work of war. Releasing a robot is like sending a missile. If a robot cannot be negotiated with, neither can a missile. The time range is greatly increased, like a landmine that long outlives the war.


  5. Check out these videos of some of the latest military robot technology:
    Boston Dynamics Big Dog
    Foster-Miller TALON
    Army ‘Crusher’ Robot in Action
    Viper – A Battle Robot
    Military Killer Robot Planes
    New Armed Robot Rolls Out
    Military robots and the future of war
    SARCOS half human half robot
    Israel: Military Robot Snake
    Army Robot – The Mule
    Crusher (CMU’s military Unmanned Ground Vehicle)
    Crusher highlights
    Unmanned micro aircraft being deployed in Iraq


  6. Actually, you know the really freaky thing? I was recently reading about the latest advances in invisibility technology in New Scientist. They’re now exploring chameleon technology, having realised that the illusion of empty space is just one type of illusion that they could create. Put the two technologies together, killer robots with chameleon ability, and I’m having visions of Arnie.
    For those who think this is unbelievable, that I’ve been smoking something, just watch this demonstration video.
    Now, what you are watching is what was publicly disclosed in 2006, a few years ago. It sounds like they’ve already moved way beyond that, and the military is very interested.


  7. Dang, one of my comments was lost. Ok, try again, have a look at these VIDEOS of some of the latest military robots.
    Boston Dynamics Big Dog
    Talon, a combat robot
    New Armed Robot Rolls Out
    Military robots and the future of war
    SARCOS half human half robot
    14A3 talon 360 terminator


  8. What’s the difference between those who hijacked planes in the USA 8 years ago and crashed them into buildings, and those who bombed Yugoslavia 10 years ago, killing about the same number of people?
    In the former, the perps all died, in the latter, none of them did.
    Robots are just one means to achieve that kind of discrepancy.


  9. I’ve been thinking. The challenges to just war theory are obvious. Well, at least some of them are obvious. But I think there are potentially some implications for pacifist theory and practice as well. For instance, might the robotization of war undermine the grounds for conscientious objection if you could participate in war without killing anyone? Is pacifism about more than just refraining from killing people? If so, would authorities be prepared to recognize this?


  10. I heard a very interesting discussion on Radio National a couple of months ago on their show called “Future Tense” which might be of interest to people following this thread. Following is an extended quote from the transcript.
    Antony Funnell: I’d like to ask you in a few minutes about the ethics and the legal side of this all, but in terms of the attitude of the military commanders, how has that changed towards robotics over the last couple of years, as they’ve started to see success with these robots in Afghanistan and in Iraq?
    PW Singer: One of the people that I interviewed for the book I think put it pretty well. He was an executive at one of the largest robotics companies here in the US, and he said, effectively, before the 9/11 attacks, they couldn’t get anyone to return their calls. Afterwards, people in the military were calling them and the exact quote was ‘Holy crap, we can’t get enough of these systems.’ And the numbers show this, that people in the military want more and more of these systems, because they are saving lives, they’re able to send these systems out in the places where they would have had to send soldiers before. There’s hundreds of soldiers that are alive now because of these systems.
    But there’s also a little bit of discomfort about it, a little bit of concern of what is this doing to the warriors’ identity? What is this doing to their place in the whole relationship of war. What is it doing to the public’s relationship with war? Is it making the military more distant? And one of the people that I met with for the book was a special operations officer, a Green Beret, he basically described how this whole trend very much scares him, but he also sees it as inevitable.
    Antony Funnell: I was going to ask you about that. If you’re a soldier on the ground, you can imagine you’d be happy if your side had these robots, but is there already fear there among US soldiers that they may also face robots as enemies?
    PW Singer: Right now it’s not so much on the other side using them, as two things: one, what is this doing to our identity, what is this doing to our situation in war? And there’s a lot of tension brewing in particular between the soldiers that are still fighting on the ground and those that are increasingly fighting from afar. You know, you have soldiers who were slogging it out in Afghanistan, but above them aren’t pilots in machines, but pilots who are sitting 7,000 miles away, and their experience of war is very different; the whole meaning of the term ‘going to war’, it’s changing in our lifetime.
    Antony Funnell: And I presume the choices that they’re making from a distance about what to bomb, what to explore, they must be different than the people there in the thick of it.
    PW Singer: Exactly. One of the people on that was a Predator drone pilot and he said, describing the experience of killing someone, ‘It was like a video game’. And when you’re a Green Beret on the ground whose identity is bound up with a warrior’s code, this is a very hard thing to take. There’s other people who are fighting from afar. When your experience of going to war is going to a place of danger, where you may never come back, your family may never see you again, but then fighting alongside you is someone whose experience of going to war is a daily commute in their Toyota into work, and they sit down behind a computer screen and they fly a drone for 12 hours, and then they end their day. They drive back home, and 20 minutes later after putting kills on enemy combatants, they’re talking to their kids about their schoolwork. So two very big differences in the experience of war.
    The other part that soldiers in the field are concerned about, and this is one thing that I really wanted to capture in the book, is how we can’t forget the humanity of war, we can’t forget that it’s driven by our human failings, and that it’s human psychology that really shapes how it plays out. And a big area to think about is what’s the impact of these systems on the very human war of ideas that we’re fighting against radical groups, and two interviews I think really capture this well. One was with a Bush administration official, he talked about how ‘the thing that scares people is our technology’, that was his quote.
    But that’s a very different message when you go and meet with people in Pakistan, or Lebanon, and one of the people that I met with for the book who was a news editor in Lebanon, talked about how these systems just show that you (he’s talking about the Americans) ‘that you’re cowards, that you’re not brave enough to come fight us, that you’re not real men, and that all we have to do is kill a few of your soldiers, and we can defeat you.’ And so you have this fascinating disconnect between the message we think we are sending with our technology, and the message that’s being received.


  11. Came across another interesting article on Robot ethics in Cosmos Magazine, dated December last year.
    “WASHINGTON: The introduction of robot ethics guidelines is needed immediately, amid surging use of the machines and concern about their lack of human responsibility, a British researcher has said.”
    “In an article published today in the journal Science, artificial intelligence and robotics professor Noel Sharkey, from the University of Sheffield in South Yorkshire, argues that the steady increase in the use of robots in day-to-day life poses unanticipated risks and ethical problems.”
    “Outside of military applications, Sharkey worries how robots – and the people who control them – will be held accountable when the machines work with “the vulnerable,” namely children and the elderly.”
    “There are already at least 14 companies in Japan and South Korea that have developed child care robots… The question here is, will this lead to neglect and social exclusion?”
    So the scientists have recognized the need for ethics; I think it’s time we Christians did as well.

