
Killer robots are the future of warfare and the ‘inevitable next step’ in Russia’s long bloody invasion of Ukraine

By James Dawes and The Conversation
February 21, 2023, 3:11 PM ET
Evacuation robot (unmanned ground vehicle) THeMIS seen on a dusty road during field tests. Mykhaylo Palinchak—SOPA Images/LightRocket/Getty Images

The U.S. military is intensifying its commitment to the development and use of autonomous weapons, as confirmed by an update to a Department of Defense directive. The update, released Jan. 25, 2023, is the first in a decade to focus on autonomous weapons driven by artificial intelligence. It follows a related implementation plan released by NATO on Oct. 13, 2022, that is aimed at preserving the alliance’s “technological edge” in what are sometimes called “killer robots.”

Both announcements reflect a crucial lesson militaries around the world have learned from recent combat operations in Ukraine and Nagorno-Karabakh: Weaponized artificial intelligence is the future of warfare.

“We know that commanders are seeing a military value in loitering munitions in Ukraine,” Richard Moyes, director of Article 36, a humanitarian organization focused on reducing harm from weapons, told me in an interview. These weapons, which are a cross between a bomb and a drone, can hover for extended periods while waiting for a target. For now, such semiautonomous missiles are generally being operated with significant human control over key decisions, he said.

Pressure of war

But as casualties mount in Ukraine, so does the pressure to achieve decisive battlefield advantages with fully autonomous weapons—robots that can choose, hunt down, and attack their targets all on their own, without needing any human supervision.

This month, a key Russian manufacturer announced plans to develop a new combat version of its Marker reconnaissance robot, an uncrewed ground vehicle, to augment existing forces in Ukraine. Fully autonomous drones are already being used to defend Ukrainian energy facilities from other drones. Wahid Nawabi, CEO of the U.S. defense contractor that manufactures the semiautonomous Switchblade drone, said the technology is already within reach to convert these weapons to become fully autonomous.

Mykhailo Fedorov, Ukraine’s digital transformation minister, has argued that fully autonomous weapons are the war’s “logical and inevitable next step” and recently said that soldiers might see them on the battlefield in the next six months.

Proponents of fully autonomous weapons systems argue that the technology will keep soldiers out of harm’s way by keeping them off the battlefield. The weapons would also allow military decisions to be made at superhuman speed, radically improving defensive capabilities.

Currently, semiautonomous weapons, like loitering munitions that track and detonate themselves on targets, require a “human in the loop.” They can recommend actions but depend on their operators to initiate them.

By contrast, fully autonomous drones, like the so-called “drone hunters” now deployed in Ukraine, can track and disable incoming unmanned aerial vehicles day and night, with no need for operator intervention and faster than human-controlled weapons systems.

Calling for a timeout

Critics like the Campaign to Stop Killer Robots have been advocating for more than a decade to ban research and development of autonomous weapons systems. They point to a future where autonomous weapons systems are designed specifically to target humans, not just vehicles, infrastructure, and other weapons. They argue that wartime decisions over life and death must remain in human hands, and that turning them over to an algorithm amounts to the ultimate form of digital dehumanization.

Together with Human Rights Watch, the Campaign to Stop Killer Robots argues that autonomous weapons systems lack the human judgment necessary to distinguish between civilians and legitimate military targets. They also lower the threshold for war by reducing the perceived risks, and they erode meaningful human control over what happens on the battlefield.

The organizations argue that the militaries investing most heavily in autonomous weapons systems, including the U.S., Russia, China, South Korea, and the European Union, are launching the world into a costly and destabilizing new arms race. One consequence could be this dangerous new technology falling into the hands of terrorists and others outside of government control.

The updated Department of Defense directive tries to address some of the key concerns. It declares that the U.S. will use autonomous weapons systems with “appropriate levels of human judgment over the use of force.” Human Rights Watch issued a statement saying that the new directive fails to make clear what the phrase “appropriate level” means and doesn’t establish guidelines for who should determine it.

But as Gregory Allen, an expert from the national defense and international relations think tank Center for Strategic and International Studies, argues, this language establishes a lower threshold than the “meaningful human control” demanded by critics. The Defense Department’s wording, he points out, allows for the possibility that in certain cases, such as with surveillance aircraft, the level of human control considered appropriate “may be little to none.”

The updated directive also includes language promising ethical use of autonomous weapons systems, specifically by establishing a system of oversight for developing and employing the technology, and by insisting that the weapons will be used in accordance with existing international laws of war. But Article 36’s Moyes noted that international law currently does not provide an adequate framework for understanding, much less regulating, the concept of weapon autonomy.

The current legal framework does not make it clear, for instance, that commanders are responsible for understanding what will trigger the systems that they use, or that they must limit the area and time over which those systems will operate. “The danger is that there is not a bright line between where we are now and where we have accepted the unacceptable,” said Moyes.

Impossible balance?

The Pentagon’s update demonstrates a simultaneous commitment to deploying autonomous weapons systems and to complying with international humanitarian law. How the U.S. will balance these commitments, and whether such a balance is even possible, remains to be seen.

The International Committee of the Red Cross, the custodian of international humanitarian law, insists that the legal obligations of commanders and operators “cannot be transferred to a machine, algorithm or weapon system.” Right now, human beings are held responsible for protecting civilians and limiting combat damage by making sure the use of force is proportional to military objectives.

If and when artificially intelligent weapons are deployed on the battlefield, who should be held responsible when needless civilian deaths occur? There isn’t a clear answer to that very important question.

James Dawes is a professor of English at Macalester College.
