10 things to know about 'killer robots'

Autonomous weapons systems decide for themselves whom to attack, and when. So far, the biggest military powers have rejected the introduction of binding rules to govern their use. Here are the key facts.

What weapons are we talking about?

The official designation is "Lethal Autonomous Weapon Systems," or LAWS for short. They're often referred to as "combat robots," but they can also take the form of drones, unmanned submarines or ground vehicles. Critics usually call them "killer robots."

What exactly is meant by this?

There is no generally accepted definition of weapon systems like these. The International Committee of the Red Cross (ICRC) describes them as weapons with autonomy in their critical functions of selecting and attacking targets, meaning that they do so without human intervention.

Do these weapons already exist?

Completely autonomous weapon systems that use lethal force against humans do not yet exist. However, rapid progress in the fields of artificial intelligence and robotics means their development is only a matter of time.

What are the precursors to them?

Many weapons already have autonomous functions. Some missiles can identify, select and attack their targets on their own. Unmanned submarines conduct autonomous sweeps for mines. Drones are capable of networking in swarms and carrying out certain tasks autonomously.

What would be a conceivable scenario in which LAWS might be deployed?

A combat robot is fed the biometric data of a "target person," whom it then autonomously seeks out, finds and kills — without the remote control via radio or satellite that is usual with unmanned drones. Lethal autonomous weapons systems pilot themselves and don't require a soldier to give the order to fire. Nobody knows when they will strike.

Drones can be used as toys, but with AI they can also become fearsome autonomous weapons

Why do experts see this as a paradigm shift in warfare?

Because the life-or-death decision is not made by a human, but by a machine. It makes no difference that the weapon was built and programmed by humans.

Why do autonomous weapons conflict with international humanitarian law?

One of the most important elements of this law is that, in any attack, belligerents must clearly distinguish between combatants and civilians (the Principle of Distinction). Civilians and civilian buildings must be spared wherever possible. Autonomous weapon systems can't do that. This is why specialists in international law are demanding that a human being must always have final control over an attack (the "man in the loop").

What international negotiations are taking place on these issues?

Discussions have been taking place at the United Nations in Geneva since 2014, within the framework of the Convention on Certain Conventional Weapons (CCW). Initial informal discussions became official negotiations in 2017, with more than 70 countries participating, as well as scientists and NGOs. A fresh round of negotiations is taking place from August 27 to 31, 2018.

How are the negotiations going?

Slowly, because the various countries' positions are very far apart. The world's biggest military powers are against banning these weapons, which they want to use to secure their supremacy. There's no agreement on the horizon.

What's Germany's position?

The coalition agreement between the CDU/CSU and the SPD says: "We reject autonomous weapon systems devoid of human control. We want to see them proscribed around the world." Nonetheless, Germany has not joined the group of countries that are demanding a binding prohibition of autonomous weapons systems. Rather, the German government is arguing for a nonbinding political declaration as a precursor to a treaty banning them.
