The world just blew a ‘historic opportunity’ to stop killer robots—and that might be a good thing

December 22, 2021, 2:00 PM UTC

It was billed as “a historic opportunity” to stop killer robots. It failed.

That was the alarming news out of a United Nations disarmament committee held in Geneva at the end of last week. The committee had spent eight years debating what, if anything, to do about the rapid development of weapons that use artificial intelligence to locate, track, attack, and kill targets without human intervention. Many countries want to see such weapons banned, but the UN group operates by consensus and several states, including the U.S., Russia, the United Kingdom, India, and Israel, were opposed to any legally binding restrictions.

In the end, the committee could agree only to keep talking, with no clear objective for future discussions. The U.S. representative to the UN committee called this kicking of the can "a dream mandate" for all countries, because it did not foreclose any particular outcome.

Many other nations and activists saw that outcome as something quite different—a woefully inadequate response.

“It was a complete failure, a disaster really,” said Noel Sharkey, an emeritus professor of robotics and A.I. at the University of Sheffield, in the U.K., and one of several spokespersons for the Stop Killer Robots campaign.

“It was a car crash,” Verity Coyle, a senior adviser to Amnesty International focused on its campaign for a ban of the weapons, said of the UN committee’s meeting.

Not science fiction any more

A.I.-guided weapons were once the stuff of science fiction—and were still largely in that realm when the UN committee first began talking about autonomous weapons in 2014. But real systems with the ability to select targets without human oversight are now starting to be deployed on battlefields around the globe. Civil society groups and scientists who are convinced that they pose a grave danger have dubbed them “killer robots.” Their more technical moniker is lethal autonomous weapons, or LAWS.

A UN report on the Libyan civil war said earlier this year that an autonomous weapon, the Kargu-2 quadcopter, produced by a company in Turkey, was likely used to track and target fleeing fighters in one engagement. There have been reports of similar autonomous munitions being used by Azerbaijan in its recent war with Armenia, as well as autonomous gun systems being deployed by Israel in its most recent conflict with Hamas.

These developments have led many to fear that the world is running out of time to take action to stop or slow the widespread use of these weapons. “The pace of technology is really beginning to outpace the rate of diplomatic talks,” said Clare Conboy, another spokesperson for the Stop Killer Robots campaign.

Silver lining

While campaigners were bitterly disappointed with the results of this month's meetings of the UN committee, some say the failure may, counterintuitively, present the best opportunity in years for effective international action to restrict the weapons. That's because it opens the door to moving discussion of an international treaty limiting LAWS to a different diplomatic venue, one where a handful of states cannot block progress.

“I think it is an exciting moment for those states calling for a legally binding instrument to come together and think about what is the best next step,” Coyle said. “The 60-odd countries that are in favor of a ban need to take a decision on starting a parallel process, somewhere where consensus rules could not be used to block the will of the majority.”

Coyle said that discussions at the UN committee, although failing to reach an agreement, had at least helped to raise awareness of the danger posed by autonomous weapons. A growing number of nations have come out in favor of a ban, including New Zealand, Germany, Austria, Norway, the Netherlands, Pakistan, China, Spain, and the 55 countries that make up the African Union.

In addition, thousands of computer scientists and artificial intelligence researchers have signed petitions calling for a ban on autonomous weapons and pledged not to work on developing them. Now, Coyle said, it was important to take that momentum and use it in another forum to push for a ban.

Coyle said one advantage of taking the discussion outside the UN's Convention on Certain Conventional Weapons, or CCW—the UN committee that has been debating LAWS regulation for the better part of a decade—is that the CCW's mandate excludes the use of these weapons by law enforcement agencies and in civil wars. Those potential uses of killer robots are of grave concern to many civil society groups, including Amnesty International and Human Rights Watch.

Campaigners say there are several possible alternatives to the CCW. Coyle said that the Group of Seven industrialized nations might become a forum for further discussion. Another option would be for one of the states currently in favor of a ban to try to push something through the UN General Assembly.

Coyle and Sharkey also both pointed to the process that led to the international treaty prohibiting the use of anti-personnel land mines, and a similar convention barring the use of cluster munitions, as potential models for how to achieve a binding international treaty outside the formal UN process.

In both of those examples, a single nation—Canada for land mines, Norway for cluster munitions—agreed to host international negotiations. Among the countries that some campaigners think might be persuaded to take on that role for LAWS is New Zealand, whose government in November committed the country to playing "a leadership role" in pushing for a ban. Others include Norway, Germany, and the Netherlands, whose governments have all made similar statements over the past several months.

Hosting such negotiations requires a large financial commitment from one country, running into perhaps millions of dollars, Sharkey said, which is one reason that the African Union, for instance, which has come out in favor of a ban, is unlikely to volunteer to host an international negotiation process.

Another drawback of this approach is that whatever treaty is developed would be binding only on those countries that choose to sign it, rather than something that might cover all UN members or all signatories to the Geneva Conventions. The U.S., for instance, has not acceded to either the land mine or the cluster munitions treaties.

But advocates of this approach note that these treaties establish an international norm and exert a high degree of moral pressure even on countries that decline to sign them. The land mine treaty resulted in many arms makers ceasing production of the weapons, and in 2014 the U.S. government promised not to use anti-personnel land mines outside the Korean Peninsula, where American military planners have argued they are essential to the defense of South Korea against a possible invasion by North Korea.


Not everyone is convinced a process outside the CCW will work this time around. For one thing, countries that fear a particular adversary will acquire LAWS are unlikely to agree to unilaterally abandon the deterrent of having that capability themselves, said Robert Trager, a professor of international relations at the University of California, Los Angeles, who attended last week’s Geneva discussions as a representative of the Center for the Governance of AI.

Trager also noted that in the case of land mines and cluster munitions, the use—and, critically, the limitations—of those technologies were well established at the time treaties were negotiated. Countries understood exactly what they were giving up. That is not the case with LAWS, which are only just being developed and have barely ever been deployed, he said.

Max Tegmark, a physics professor at MIT and cofounder of the Future of Life Institute, which seeks to address "existential risks" to humanity, and Stuart Russell, an A.I. researcher at the University of California, Berkeley, have proposed a compromise. Some countries currently standing in the way of binding restrictions on killer robots, they argue, might be persuaded to support a ban on autonomous weapons below a certain size or weight threshold that are designed primarily to target individual people.

Tegmark said that these “slaughterbots,” which might be small drones that could attack in swarms, would essentially represent “a poor man’s weapon of mass destruction.” They could be deployed by terrorists or criminals to either commit mass murder or assassinate individuals, such as judges or politicians. This would be highly destabilizing to the existing international order and so existing powers, such as the U.S. and Russia, ought to be in favor of banning or restricting the proliferation of slaughterbots, he said.

Tegmark said progress toward a ban had been made more difficult by the conflation of these small autonomous weapons with larger systems designed to attack ships, aircraft, tanks, or buildings. Established powers would be more hesitant to give up those larger weapons, he said.

Sharkey said that Tegmark’s and Russell’s position represented a fringe view and was not the position of the Campaign to Stop Killer Robots. He said civil society groups were just as concerned about larger autonomous weapons, such as A.I.-piloted drones that could take off from the U.S. and fly across oceans to bomb targets, possibly killing civilians in the process, as they were about smaller systems that could target individual people.

Campaigners also say they worry about the sincerity of some of the countries that have said they support a binding legal agreement restricting LAWS. For instance, China has supported a binding legal agreement at the CCW, but has also sought to define autonomous weapons so narrowly that much of the A.I.-enabled military equipment it is currently developing would fall outside the scope of such a ban. China has sold drones with autonomous attack capabilities to other countries, including Pakistan. Pakistan has also been a leading proponent of a binding agreement at the CCW, but has taken the position that without a ban that would cover all countries, it needs such weapons to counter India’s development of similar systems.

