It may have seemed like just another obscure United Nations conclave, but a recent meeting in Geneva was followed intently by experts in artificial intelligence, military strategy, disarmament and humanitarian law.
The reason? Killer robots — drones, guns and bombs that decide on their own, with artificial brains, whether to attack and kill — and what should be done, if anything, to regulate or ban them.
Once the domain of science fiction films such as the Terminator series and RoboCop, killer robots, technically known as Lethal Autonomous Weapons Systems, have been invented and tested at an accelerated pace with little oversight. Some prototypes have even been used in actual conflicts.
The evolution of these machines is considered a potentially seismic event in warfare, akin to the invention of gunpowder and nuclear bombs. This year, for the first time, a majority of the 125 nations that belong to an agreement called the Convention on Certain Conventional Weapons said they wanted curbs on killer robots. But they were opposed by members that are developing these weapons, most notably the US and Russia. The group’s conference concluded with only a vague statement about considering possible measures acceptable to all. The Campaign to Stop Killer Robots, a disarmament group, said the outcome fell “drastically short”.
What is the Convention on Certain Conventional Weapons?
Sometimes known as the Inhumane Weapons Convention, it is a framework of rules that ban or restrict weapons considered to cause unnecessary, unjustifiable and indiscriminate suffering, such as incendiary explosives, blinding lasers and booby traps that don’t distinguish between fighters and civilians. The convention has no provisions for killer robots.
What exactly are killer robots?
They are widely considered to be weapons that make decisions with little or no human involvement. Rapid improvements in robotics, AI and image recognition are making such armaments possible.
Why are they considered attractive?
The weapons offer the promise of keeping soldiers out of harm's way and making faster decisions than a human would, by handing battlefield decisions to autonomous systems such as pilotless drones and driverless tanks that independently decide when to strike.
What are the objections to such weapons?
Critics argue it is morally repugnant to assign lethal decision making to machines, regardless of technical sophistication. How does a machine differentiate an adult from a child, a fighter with a bazooka from a civilian with a broom, a hostile combatant from a wounded or surrendering soldier?
“Fundamentally, autonomous weapon systems raise ethical concerns for society about substituting human decisions about life and death with sensor, software and machine processes,” Peter Maurer, president of the International Committee of the Red Cross and an outspoken opponent of killer robots, told the Geneva conference.
Why was the Geneva conference important?
The conference was widely considered by disarmament experts to be the best opportunity to devise ways to regulate, if not prohibit, the use of killer robots. It was the culmination of years of discussions by a group of experts who had been asked to identify the challenges and possible approaches to reducing the threats from killer robots.
What do opponents of a new treaty say?
Some, like Russia, insist that any decisions on limits must be unanimous — in effect giving opponents a veto. The US argues that existing international laws are sufficient and that banning autonomous weapons technology would be premature. The chief US delegate to the conference, Joshua Dorosin, proposed a nonbinding “code of conduct” for use of killer robots — an idea that disarmament advocates dismissed as a delaying tactic.
Where have autonomous weapons been used?
In March, UN investigators said a “lethal autonomous weapons system” had been used by government-backed forces in Libya against militia fighters. A drone called Kargu-2, made by a Turkish defence contractor, tracked and attacked the fighters as they fled a rocket attack. In the 2020 war in Nagorno-Karabakh, Azerbaijan fought Armenia with attack drones and missiles that loiter in the air until detecting the signal of an assigned target.
What happens now?
Disarmament advocates said the outcome of the conclave had hardened their resolve to push for a new treaty in the next few years, similar to the treaties that prohibit land mines and cluster munitions.
NYTNS