Stop killer robots?

The US Department of Defense attempts to allay fears

Should you fear killer robots?

By Daniel J. McLaughlin

Killer robots may seem like a scary creation from science fiction, but these lethal autonomous weapons are actually a reality.

There are calls across the world to ban killer robots, with an international treaty proposed to stop them from falling into the wrong hands.

However, the US Army promises that humans will always make the final call on whether the robots should shoot to kill.

The Claim

According to the Independent, the world is calling for an international treaty to stop killer robots before rogue states get their hands on them.

Human Rights Watch argues that by allowing these killer robots - or fully autonomous weapons - to be developed, "we are crossing a moral line".

Mary Wareham, who co-ordinates the Campaign to Stop Killer Robots at the organisation, likened the movement to the efforts to remove landmines from battlefields.

She said: “The efforts to deal with landmines were reactive after the carnage had occurred. We are calling for a pre-emptive ban.”

Wareham added: “We are not just concerned about use in warfare, but use in other circumstances – policing, law enforcement, border control. There are many other ways in which a fully autonomous weapon could be used in the future.”

There is widespread support for banning these robots. A poll conducted across 26 countries found that 60 per cent of respondents oppose lethal autonomous weapons that can kill without human input. Only a fifth said they backed the machines.

The Counterclaim

However, the US Department of Defense is trying to allay fears about killer robots.

It said that humans will always make the final decision on whether armed robots can shoot, the BBC reports.

The Department of Defense is hoping to update its Advanced Targeting and Lethality Automated System (ATLAS), which helps soldiers on ground combat vehicles aim their guns.

In an updated proposal, the department emphasised the key role of humans in the aiming process.

The US Army remains "committed to the rules governing human-robot interaction, known as directive 3000.09, which require a human finger on every trigger".

An official said that the US Army was not putting robots in a position to kill anyone.

The Facts

A ban on killer robots was blocked after a week of United Nations talks in Geneva in September.

A group of states with advanced militaries, including the United States and Russia, prevented progress on an international treaty to ban the lethal autonomous weapons. South Korea, Israel and Australia were also among those that blocked the treaty at the UN's Convention on Certain Conventional Weapons (CCW).

The CCW requires consensus in its decision-making process.

In November, Russia insisted on limiting the amount of time that could be devoted to discussing the proposal in 2019.
