
Editorial: Terminators

In the era of AI-enabled weapons, the rules of conflict need updating

The Editorial Board Published 31.12.21, 12:21 AM

Would shooting someone dead be considered murder if no one pulled the trigger? That is the macabre but increasingly realistic question at the heart of a heated debate over the future of warfare amidst the rapid advance of ‘killer robot’ weapons — drones, guns, ships and more that are designed to attack people based on their own assessment, without human intervention. Earlier this month, at the United Nations in Geneva, a majority of the 125 nations that are party to the Convention on Certain Conventional Weapons supported, for the first time, regulations on these arms and platforms, which were confined to the realm of science fiction until a few years ago. Officially, they are called lethal autonomous weapons systems. But key members of the UN Convention, which are leaders in developing these weapons, do not want any international law limiting them. The United States of America, Russia, the United Kingdom, India, Japan, South Korea, Australia and Israel have opposed mandatory restrictions on autonomous weapons.

The attraction of such artificial intelligence-driven military hardware is easy to comprehend. Countries can — in theory — be more surgically precise in military attacks. Deploying robots to do the job of soldiers also reduces the risk of body bags returning home, making it politically easier to pursue long wars in distant lands. But a range of moral and legal questions arise from this approach to military campaigns. Even AI developed by the world’s most advanced companies has repeatedly been shown to harbour the biases of those who build the technology, be it facial recognition software misidentifying people of colour or rating systems that systematically score women as poor performers. Can intelligent drones be trusted to tell the difference between a teenager with a toy gun and a young militant with a real weapon? What if extremists capture these platforms, as the Taliban have taken control of American weapons left behind in Afghanistan? And if an autonomous system commits a war crime, whom does the international community bring to trial? These are not distant dilemmas: according to the UN, the world witnessed its first killings by LAWS in 2020 in the Libyan war. The devastation of the Second World War and the horrors of the atomic bombings of Hiroshima and Nagasaki forced the world to agree on rules of conflict, codified as the Geneva Conventions. Those norms now need an update in the era of AI-enabled weapons. It should not take another war to convince the world.
