
War By Remote: Battlefields of the Future

September 27, 2013


These are my notes from a thought-provoking conference on the military, strategic, ethical and humanitarian dimensions of new and emerging “remote” technologies of current and future warfare.  The presentations helpfully reminded us that this is not merely about the use of drones (although their use and accountability are major issues).  Robotics that take humans out of decision processes, cyber-attacks and propaganda must all be considered and addressed in relation to International Humanitarian Law.


A conference jointly hosted by the University of Copenhagen’s Centre for Military Studies and the Centre for International Law and Justice brought a selection of speakers together (including Christopher Coker of the LSE, Micah Zenko from the Council on Foreign Relations and Louis Maresca from the ICRC) to look at the military, ethical and legal implications of developments in kinetic and non-kinetic weapons used remotely.  The three panels considered strategy and law, ethics, and the political implications for Denmark.

Conference opening remarks:

There is public concern that remote warfare (drones hooked to TV screens) is generating a desensitising “PlayStation mentality” towards warfare.  There has been an element of war by remote for thousands of years: arrows, crossbows, gunpowder, rifles, snipers, artillery, aircraft, the V1 and V2 of WWII, guided missiles… There may be a tendency to overestimate the impact of remote technology.  This conference is not just about drones.  “Remote” technology is shifting into the “robotic” domain, including systems capable of evaluating situations and making complex decisions.

Panel 1 – Strategy and Law

Micah Zenko (Council on Foreign Relations): “Contemplating Future Wars”.

To understand the next 20-25 years of warfare it is necessary to understand the granularity of the way in which war has been conducted over the last 25 years, partly because the next 25 years will be broadly the same, but also to be aware and cautious of various “mythologies of intervention”, e.g.

  • Bosnia 1995: Myth that US employment of air power in Bosnia in 1995 was the key factor that “worked” in resolving the conflict
  • Kosovo 1999: same myth about US airpower
  • Libya 2011: in reality there was no enforced No Fly Zone, and the rebels’ own military capabilities are ignored
  • Targeted killing by drones – mythologised by TV/Media

We are living in an unprecedented era of Great Power peace – minimal numbers of interstate wars.  When wars occur, fewer die (90% fewer than fifty years ago).  The utility of nuclear weapons is diminishing.  It is more or less “unthinkable” for a state to use coercion and force to take control of another state.  The increasing number of democratic states assists this, and the progress will endure.  Although threats to the US “have never been less”, the nature of the threat (i.e. terrorism) does not fit previous classical conflict paradigms and leaves the US uncomfortable.  Official US descriptions of its use of drones are at odds with their actual employment – drones are not purely used to kill senior AQ chiefs: in many ways they are a “COIN” (counter-insurgency) air force for other countries, e.g. Yemen and Pakistan, killing individuals involved in local/internal conflicts with the state.  Generally the CIA does not know whom it has killed in a given strike.

Often US drones are used to gather intelligence for other countries – Turkey, Honduras, France – so they can do the killing.  France has killed 600 insurgents in Mali using intelligence gathered from unarmed US “Reaper” drones.  This gets less media attention.  US drone technology expanded exponentially in 1998/99, almost entirely as a result of the hunt for Bin Laden and the need to strike quickly.  Real-time “eyes on” coupled with a weapon system (a Hellfire missile fitted to a Predator drone) massively shortened response times.  It was an ad hoc tool for getting one person – now its use has expanded.  The main impact of this capability is on civilian policy/decision makers – it lowers their threshold for authorising force.  An additional impact of this sort of precision intelligence and strike technology has been the “mythologisation” and even “fetishisation” of Special Forces operations.  Opinions are distorted – this ignores the thousands of regular troops needed to prepare the way, provide support and clear up the mess.

Increasingly, although the US might say that “all options are on the table”, “boots on the ground” (which can be highly effective) seem no longer to be part of the toolkit.  A greatly overlooked area of study is the electromagnetic spectrum and the potential for it to be interfered with: loss of satellites can wipe out the use of mobile phones, GPS, credit cards… There is not much thinking on how space can be used as an operational tool, yet nothing in the world can happen without satellites.

Louis Maresca (ICRC): “Future Battlefields and International Humanitarian Law”.

International Humanitarian Law (IHL) seeks to:

  • limit the effects of armed conflict
  • achieve restrictions on war means and methods

Looking a bit further into the future, to “autonomous weapons” – the ability of a weapon system to operate without human intervention or supervision (“Killer Robots!”).  There is a sliding scale of autonomy: anti-missile/anti-aircraft systems can identify targets and engage them but cannot adapt to changing circumstances.  There are significant legal, ethical and social implications of “killer robot” artificial intelligence where humans are taken out of the decision loop.  There is a need to assess the possible impacts of future weapons – new weapons must be used in accordance with IHL (e.g. the 1995 prohibition on blinding laser weapons).  With any new weapons, warring parties need to be able to:

  • Distinguish between combatants and non-combatants
  • Take all feasible precautions to avoid civilian casualties

Human Rights Watch is campaigning for a moratorium on the development of “killer robots”; the ICRC is not involved in this.  Will automated weapons of the future be able to distinguish between combatants, non-combatants, injured combatants, armed civilians, unarmed soldiers, prisoners and the act of surrendering, and make military judgements based on proportionality?  At the moment the most advanced systems cannot distinguish between an apple and a tomato.

Prospects – in the near future, humans will remain in the loop.  The UK policy is (thus far) not to develop weapons of this type.  But conflict in the future may become much faster and more complex – humans will not be able to keep up.  There is a concern over the willingness of society to accept such weapons systems, regardless of their legal acceptability under IHL. ICRC issued a report last year: “New technologies and warfare”.

Q&A Session 1

  • Vulnerability of drones – an unnamed US general: “the day drones are employed in a contested air space against a real enemy they will all fall from the sky”.  The post-Abbottabad killing of Bin Laden prompted a Pakistan Air Force inquiry; an unnamed Pakistani commander: “I could shoot down all the US drones over Pakistan tomorrow if I was told to”.
  • “Mission creep” – drones end up being normalised into other roles – monitoring US/Canada border, tracking cattle rustlers in North Dakota
  • Legality of targeting “combatants” when they are outside the battle area – e.g. moving in a different country.  Defining the Theatre of Operations becomes difficult: “Turns the whole world into a battlefield”…

Panel 2 – Ethics

Christopher Coker (LSE):

Six years after the Wright brothers first flew, the British government was already debating airpower.  At Waterloo, Wellington declined to allow his artillery to fire upon Napoleon when he moved into range.  Plans to target German commanders during the 1944 invasion of Normandy were dismissed.  But we find that conventions, laws and ethical norms quickly adjust.  The risks of political decapitation include killing the one person who can authorise surrender – drone strikes undermine control in the tribal areas.  Behavioural profiling has been used in British society since the 1980s as a means of policing.  CCTV, face recognition – soon to come will be body language recognition.  We are merely bringing this technology of “risk management” to the battlefield.  There is an increasing Western preference for keeping troops as far away from battlefield risks as possible.  We now face “disassociation” risks – no empathy for or understanding of the enemy (i.e. “know your enemy”) as we sub-contract the work to machines.  “Machine ethics” – might machines actually be more ethical, as they are more consistent in their application of decisions?  The future of conflict: there is no “New Order” of states and power, merely a never-ending series of risks to be managed…

Michael Gross (Haifa University): “Ethics and non-kinetic warfare – cyber warfare and public diplomacy”

There are some key trends in non-kinetic warfare – at the state level:

  • Cyber warfare
  • Public diplomacy (aka propaganda)
  • Sanctions – e.g. financial

At the non-state level:

  • “Cyber-terrorism”
  • Non-violent resistance
  • Economic warfare

Non-kinetic warfare has several advantages over kinetic: it is cheaper, its human costs are lower, and it does not provoke counter-attacks or condemnation on the same scale.  “Soft power” – public diplomacy – has received little public attention.

Cyber warfare: civilian and military targets are harder to distinguish between.  Examples – denial of service (Estonia ’07, Georgia ’08), destroying data (Saudi Aramco ’12), destroying equipment (Stuxnet ’10).  There can be a knock-on effect on infrastructure (dams, water purification…) and a mix of physical and psychological damage – communications networks, financial networks, water, medical/fire/police services, transport networks.  Cyber warfare attacks military targets for military advantage; cyber terrorism “targets the innocent” – causing anxiety, stress (PTSD?), depression.  Is this terrorism (and therefore outlawed under IHL)?  How should a state respond?

The Tallinn Manual on the International Law Applicable to Cyber Warfare, written at the invitation of the NATO Cooperative Cyber Defence Centre of Excellence by an independent “International Group of Experts”, is the result of a three-year effort to examine how extant international law norms apply to this “new” form of warfare.

Public diplomacy aims to intensify favourable opinions, reverse hostile ones and attract the indifferent (or at least prevent the indifferent from becoming hostile).  Propaganda types: white (tell the truth), black (tell lies), grey (a mix).  A certain amount of “spin” is legitimate.  ISAF in Afghanistan is “constrained by legal, political and ethical considerations” – unable to rebut or counter Taliban propaganda.  The media as a force-multiplier poses an ethical dilemma: is it permissible to manipulate the truth?  If so, under what conditions?  There is no injunction against lying in war.


Q&A Session 2

  • Coker: why does “war” have to be kinetic/involve killing?  Why not soft power instead of hard power to win a war?
  • Gross: his thinking has shifted from seeing a drone-strike killing as an extra-judicial execution to seeing it as a military strike against a legitimate military target.  Human rights organisations focus on the collateral damage of a drone strike, not on whether the target was correctly identified as a legitimate military target.

Panel 3 – Remote war and the Danish political perspective: “New threats but nothing new”

Three Danish members of parliament: John Paulsen and Rasmus Petersen from the coalition government, and Søren Pind of the centre-right opposition.

  • Paulsen: yes, new threats, but they have been around for ten years or so.  Is a cyber-attack a genuine, Article V NATO-response invoking attack?  Yes – if we want it to be!
  • Probably no need for a significant shift in laws as long as we retain civilian control on their use and don’t forget about the “old” weapons systems that are still mainly used.
  • Petersen: These new weapons systems do not really change the basic rules of war – we should apply the same rules as we do for aircraft and rifles.  Opposed to use of assassination as a political tool, but in favour of use of drones for Peace Keeping Operations.
  • Pind: “We” use weapons – who is “we”, who is “the West”?  We should be waging war as a war of ideas and in accordance with our fundamental values.  Cyber systems are still only a weapon; this is more about the way it is used than the type of weapon per se.  If war were costless, we might love it too much – targeted killing is a very serious issue.  Denmark should acquire drones – with open civilian accountability in their use.
