Eighty-nine nations that are party to the United Nations Convention on Certain Conventional Weapons (CCW) have voted to convene a group of governmental experts twice in 2017 to discuss the implications of autonomous weapons, which would choose targets without human oversight. According to Human Rights Watch, the agreement could move the international body toward a broad ban on weapons directed by artificial intelligence.
Autonomous weapons systems—or, more colloquially, killer robots—have been the subject of intense criticism from tech leaders and global thinkers including Elon Musk, Stephen Hawking, and Steve Wozniak. In part, they argue that warfare conducted by robots would risk civilian casualties with relatively little oversight.
According to the Campaign to Stop Killer Robots, culpability for those deaths would also be unclear. Much as in the case of driverless cars, the death of a civilian at the hands of a weaponized robot could be variously blamed on its programmer, manufacturer, or operator.
According to Human Rights Watch’s Steve Goose, the development of killer robots would open a Pandora’s box with irreversible consequences. “Once these weapons exist,” he said in a statement, “there will be no stopping them. The time to act on a pre-emptive ban is now.”
But, again paralleling debates about self-driving cars, others argue that robots could actually decrease civilian casualties. Among other advantages, robots would not act out of fear for their own safety, and could make swift decisions with more clarity than human soldiers.
Some autonomous weapons systems are already in various stages of development, though it’s important to distinguish them from drones remotely controlled by human operators. One publicly known autonomous weapon is Israel Aerospace Industries’ Harop drone, designed to autonomously target enemy air-defense systems. Sentry guns with autonomous capabilities have been deployed in South Korea and Israel, though those weapons are not currently operated in autonomous mode.
Similarly, several platforms built by Google’s Boston Dynamics with U.S. defense funding could be weaponized, though they’re publicly touted as rescue robots.