
Push Grows for Global Ban on AI-Powered ‘Killer Robots’

By Conor Lennon | Technology | 2025-06-01, 8:30pm


A drone flies over Mount Tamalpais in the USA. Unsplash/Ian Usher



A future where algorithms decide who lives or dies is no longer science fiction. Artificial Intelligence (AI)-powered drones are transforming warfare, raising urgent ethical and legal concerns. As these technologies rapidly evolve, international pressure is mounting to regulate so-called “killer robots” before their use becomes widespread and irreversible.

Each day, we willingly share personal data with machines, often without thinking, whether by clicking “agree” on a cookie banner or typing a query into a search engine. But what happens when machines use this data not just to sell us products, but to select human targets for lethal attacks?

That disturbing prospect is at the heart of growing calls from the United Nations and NGOs for international regulation of Lethal Autonomous Weapons Systems (LAWS). These weapons, capable of operating without human oversight, may soon decide autonomously whom to kill.

Recent events highlight this urgency. Ukraine’s Kherson region has endured months of drone attacks by Russian forces, reportedly killing over 150 civilians and injuring hundreds. A UN-appointed investigation found these attacks amount to crimes against humanity. Meanwhile, Ukraine itself is developing a “drone wall” of armed UAVs to defend its borders—demonstrating how even low-cost drones can be adapted for deadly use.

Where once only wealthy nations deployed high-tech UAVs, AI-driven weapons are now reshaping conflict worldwide, lowering the barrier to autonomous warfare and blurring lines of accountability.

UN officials and human rights organisations are sounding the alarm.

“Machines should never decide to take human life. That is morally repugnant,” said Izumi Nakamitsu, UN Under-Secretary-General and head of the Office for Disarmament Affairs. “Such weapons must be banned by international law.”

Human Rights Watch describes the rise of autonomous weapons as a dangerous extension of “digital dehumanisation”, in which AI systems make life-changing decisions in areas from law enforcement to warfare.

“Countries like the US, Russia, China, Israel, and South Korea are racing to develop autonomous weapons,” warned Mary Wareham, Advocacy Director at Human Rights Watch. “AI is already being integrated into land, air, and sea-based systems.”

Proponents argue that AI can outperform humans by identifying threats with speed and precision, free from fatigue or emotion. But critics point to serious flaws: the technology remains prone to misidentifying civilians as combatants, and it errs most often against already vulnerable groups, such as people with disabilities and individuals with darker skin tones.

“AI inherits the biases of its programmers,” said Wareham. “Facial recognition errors could have deadly consequences.”

Ethically, autonomous weapons also present an accountability vacuum.

“Who is responsible if a robot commits a war crime?” asked Nicole Van Rooijen, Executive Director of the Stop Killer Robots coalition. “The manufacturer? The coder? The military? Without clear answers, justice becomes impossible.”

Efforts to regulate LAWS are not new. Talks began in 2014 under the UN’s Convention on Certain Conventional Weapons (CCW), even before autonomous systems were used in warfare. Today, with battlefield deployment a reality, the push for legally binding rules has gained momentum.

UN Secretary-General António Guterres has called on Member States to reach an agreement by 2026. At least 120 countries now support negotiating a new international treaty banning fully autonomous weapons. Tech workers, peace laureates, faith leaders, and AI experts have all joined the campaign.

Despite years of discussions, nations have yet to agree on a definition of autonomous weapons—let alone regulations. Still, optimism remains. A “rolling text” proposed by the current CCW chair may provide a basis for progress, if political will exists.

“We’re not yet negotiating a treaty,” Van Rooijen acknowledged. “But the groundwork is being laid.”

“There is growing consensus,” said Nakamitsu. “When it comes to war, someone must be held accountable. Machines cannot be trusted with that responsibility.”