A UN conference failed to agree on banning the use and development of so-called "slaughterbots" at a meeting in Geneva last week, raising alarm bells among experts in artificial intelligence, military strategy, disarmament and humanitarian law.
Slaughterbots are weapons that select and apply force to targets without human intervention. Instead, they make their decisions with artificial intelligence software, which is essentially a series of algorithms.
For the first time ever this year, the bulk of the 125 nations that belong to the United Nations' Convention on Certain Conventional Weapons (CCW) said they wanted new laws to be introduced on killer robots.
However, some countries that are developing these weapons, including the U.S. and Russia, were in opposition, making a unanimous agreement impossible. The U.K. and several other nations also objected.
"We would have liked to have seen everyone get behind that," Emilia Javorsky, a physician-scientist who leads the Future of Life Institute's advocacy program on autonomous weapons, told CNBC. "All it takes is one," she added.
The conference concluded Friday, with the group pledging to "intensify" discussions and consider possible steps that are acceptable to all.
An 'epic failure'
The fact that the CCW failed to agree on anything concrete last week was described as an "epic failure" by Javorsky. "It is now blatantly clear this forum — whose unanimity requirement makes it easily derailed by any state with a vested interest — is utterly incapable of taking seriously, let alone meaningfully addressing, the urgent threats posed by emerging technologies such as artificial intelligence," she said.
Verity Coyle, a senior advisor at Amnesty International, said in a statement that the window of opportunity to regulate killer robots grows ever smaller as research and testing of these weapons presses forward.
"The CCW has once again demonstrated its inability to make meaningful progress – it's now time that committed states take the lead on an external process that can deliver the type of breakthrough we've previously seen on landmines and cluster munitions," she said.
Killer robots already in use
Despite what some people may think, slaughterbots are already being used on the battlefield today.
In Libya, Kargu drones made by Turkey's STM have been used in the nation's civil war, according to a UN report published in March.
These Kargu drones are small portable rotary wing attack drones that provide "precision strike capabilities for ground troops," according to STM's website.
The Kargu drones were used in Libya to hunt down retreating soldiers, according to the UN report.
"It's the first really solid documentation we have of a use case of these types of weapons," Javorsky said in reference to the report. "But we're seeing reports of this at accelerating rates. We're hearing about swarms being used and deployed and developed. So that reality is very much here today."
Companies making these drones are trying to develop AI systems that can identify the thermal signature of a human target or identify their face via a camera. But reliably distinguishing combatants from non-combatants requires a degree of accuracy that is difficult to guarantee.
It's drones like STM's that campaigners are most worried about. These drones, which look similar to a normal consumer drone but have a gun attached, are fairly inexpensive to buy and relatively easy to mass produce.
Max Tegmark, a professor at the Massachusetts Institute of Technology and the president of the Future of Life Institute, told CNBC that gangs will try to use slaughterbots if they become affordable.
"That's going to be the weapon of choice for basically anyone who wants to kill anyone," he said. "A slaughterbot would basically be able to anonymously assassinate anybody who's pissed off anybody."
Hard to agree?
The International Committee of the Red Cross, viewed by many as the organization that is the custodian of the law of war, has called for the prohibition of autonomous weapons that are designed or used to target human beings.
Richard Moyes, coordinator of the Stop Killer Robots campaign, said in a statement that government leaders need to draw a moral and legal line for humanity against the killing of people by machines.
"A clear majority of states see the need to ensure meaningful human control over the use of force," he said. "It's time now for them to lead in order to prevent the catastrophic humanitarian consequences of killer robots."
Campaigners say there are a lot of parallels to be drawn between bioweapons and lethal autonomous weapons. The U.S. and Russia, which have two of the most advanced military forces in the world, realized it was in neither of their interests to let other countries have cheap and accessible bioweapons, Javorsky said.
"I think that is very comparable with the thinking around these small embodiment systems that target people," she explained. "These are cheap and scalable so if you're a leading military it's not in your best interest to have scalable weapons that can easily proliferate and be used against you."