Killer tech: Editorial on the use of artificial intelligence during war

Many militaries have developed unmanned land vehicles and are developing robot soldiers

The Editorial Board Published 22.02.23, 05:08 AM

Artificial intelligence can crack tough tests, answer nuanced questions, and perform many of the more mundane tasks of the modern labour force. But should the excitement around AI be tempered with caution when the technology is put to military use? That question was at the heart of a conference in the Netherlands last week that debated whether the use of AI in war needs guard rails. In many ways, it is already too late to be asking this question. The war in Ukraine has served as a laboratory for AI-driven tools. Moscow has deployed deadly suicide drones to target Kyiv and other Ukrainian cities. Ukraine, meanwhile, has used unmanned, autonomous sailing drones to carry out at least one audacious attack on Russia’s Black Sea fleet, and AI-based software has helped it track the movements of enemy forces. None of this should come as a surprise. The world’s major militaries have been developing autonomous weaponry for years. The United States of America and China have already built giant, unmanned warships capable of travelling long distances and firing missiles. Many militaries have developed unmanned land vehicles and are developing robot soldiers. The advantages of unmanned and autonomous weapons systems are obvious: in theory, they can reduce the body bag count of the militaries that deploy them and eliminate some of the risks of human error born of battle fatigue, anger or frustration.

But the dangers that these systems pose are too significant to be ignored. Without adequate care, AI-led systems could turn a war that is all too real for those at the receiving end of bullets and missiles into a video game for those firing them. Eliminating human judgement from life-and-death decisions is a choice fraught with ethical, legal and strategic consequences. If an unmanned ship shoots and kills innocent civilians, who should be held accountable for war crimes? Can military leaders be certain that autonomous systems running on machine learning will not turn on their own masters, or be hacked by the enemy? It is for this reason that every quantum leap in technology requires a global set of rules governing its use. The world needs an agreement laying out the limits on the use of AI for military purposes, just as the Nuclear Non-Proliferation Treaty, the Chemical Weapons Convention and the Outer Space Treaty did for earlier breakthroughs. The same technology that could help curb the costs of war on societies could otherwise amplify them.