The Sound of Science

Fridays 9am – Sheffield Live! – 93.2FM

Show 6 – First broadcast 20/06/2008

Drop us an email –

Armed Military Robots (part 1 of 2)

Presented by Professor Noel Sharkey, University of Sheffield

This week is part 1 of a two-part special on Robots of War. Noel explains the ethical issues of using military robots that are allowed to apply lethal force on their own terms. These are not Terminator-style robots; they are more like tanks, trucks or fighter jets. There are over 4,000 robots currently deployed on the ground in Iraq, mainly for explosive disposal, although some are armed. There are also many robot fighter planes such as the Predators and the Reapers. For now there is always a person in the loop to decide when to kill, but this could change soon. Find out how this ties in with the Laws of War and the international laws on discrimination.

Over the next two weeks Noel interviews experts in this field – Professor Ron Arkin, director of the mobile robotics lab at Georgia Institute of Technology; Dr Peter Asaro, philosopher of technology at Rutgers University in New York; Matt Armstrong, an independent defence analyst in California; Rear Admiral Chris Parry, who worked for the UK MoD; and Richard Moyes, policy director of Landmine Action. Noel also attends the International Military Robotics conference in London and talks to military robotics people from NATO and the German and Swedish Armies, as well as the French equivalent of our MoD.

Get involved in the discussion at the AUVSI forum on Armed Unmanned Systems

Interview 1:

Noel talks to Ron Arkin, Regents’ Professor, College of Computing, Georgia Tech about some of the dangers facing us in the near-future with robots that decide who to kill. Professor Arkin tells us about his work on developing an Artificial Conscience for a robot and about some of the difficult ethical decisions that both soldiers and robots have to make in war.

Ron Arkin and his robots

You can find the examples that Professor Arkin talks about in his interview on his website.

Interview 2:

Noel talks to the exciting young philosopher Dr Peter Asaro from Rutgers University in New York. Peter talks about a range of issues concerning the dangers of using autonomous robot weapons. He cautions us about the sci-fi future that the military seems to be heading towards and how a robot army could take over a city. Interestingly, he makes the provocative claim that one of the first uses of insurgency was by the early Americans against the British redcoats.

Interview 3:

Noel talks to Matt Armstrong, an independent analyst specialising in public diplomacy and strategic communications, working in California. Matt writes a well-known blog called MountainRunner. On the programme he discusses the "hearts and minds" issue (a term he dislikes) and the problems with having a robot as the "strategic corporal" of the future.

Matt Armstrong

Next Week

Part two of the International Military Robotics special continues next week with interviews with Rear Admiral Chris Parry, Richard Moyes from Landmine Action, and military robotics people from NATO and the German and Swedish Armies, as well as the French equivalent of our MoD.




  1. I believe that while LOAC may be a poor regulator of AI, it is no worse a regulator than it has been of non-artificial intelligence. First of all, we DO have programs that can differentiate emotions and estimate the likelihood of violence (“innocence discrimination”). Secondly, Proportionality does not allow robots to attack “providing there is a greater probability of killing soldiers than innocent people”. Proportionality requires that the military advantage be proportionate; it is not a count of militants vs civilians. I wrote a paper on this and it should be up in a week or so – “Replacing Soldier Intuition with Artificial Intelligence”. Professor Sharkey, I wholeheartedly respect your goals for military use of technology, and I hope you give me feedback on my paper so I can learn from your experience and better my understanding of the situation. Thanks.

    Comment by Luke Haberkern | December 13, 2008 | Reply

    • Hi Luke,

      You open up a couple of interesting issues from our Show 6 back in June this year. This is not the most appropriate forum in which to discuss them, but they are very important questions, so I will address them briefly (to do justice to my arguments you would need to read some of my journal articles on the topic). But here goes:

      There are some “laboratory systems” that have a reasonable degree of accuracy in determining emotional expression, although they are easy to fool. There are also “laboratory methods” for telling whether someone is lying. Neither methodology is very accurate, and neither could be used on moving targets in dirty and dangerous environments. Besides, this is not what discrimination is about. In a war zone people can be very stressed and emotional about many things going on, and may feel guilty about matters that have nothing to do with whether they are a combatant or a non-combatant. Discrimination is something altogether more subtle, which we can discuss at length if the opportunity arises. Discrimination in insurgent warfare most often requires inference and judgement about the situation. It is, for example, not always appropriate to kill an insurgent; the person may even be mentally ill.

      In sum, taking an emotional detection system into a war zone and using it as the basis for killing people would be totally wrong and would lead to many innocent deaths.

      Now as to proportionality: the Principle of Proportionality “requires that the anticipated loss of life and damage to property incidental to attacks must not be excessive in relation to the concrete and direct military advantage expected to be gained.” (Petraeus, D.H. and Amos, J.F., Counterinsurgency, Headquarters of the Army, Field Manual FM 3-24 / MCWP 3-33.5, Section 7-30). This is pretty much a paraphrase of Protocol I of the 1977 additions to the Geneva Conventions.

      Although this does not mention civilian v. soldier directly, it does not take much reading to discover that this is the main justification used for collateral damage involving civilian casualties or fatalities. A recent example is the overuse of the phrase “high-value targets” to refer to decapitation attacks on al-Qaeda suspects in Pakistan (Waziristan). It implies a proportionality calculation that balances the potential loss of innocent lives against the military advantage of killing the target. For example, on June 17, 2007, a failed attempt to kill Abu Laith al-Libi resulted in the deaths of 7 Afghani children. After some blustering, U.S. military officials admitted to NBC News that they had been aware of the children’s presence but that the commander was such a “high-value target” that it was worth the risk that some children might be casualties.

      In sum, while you are correct that proportionality is about military advantage in general, a subcategory is collateral deaths of innocents.

      Of course I am not a legislator or a military commander, so all I can do is point out the issue to the public and policy makers and try to get a discussion going. In that light I am very pleased that you have taken the time and effort to comment, Luke.

      Comment by noelsharkey | December 14, 2008 | Reply
