Tech

Doctors who used AI assistance in procedures became 20% worse at spotting abnormalities on their own, study finds, raising concern about overreliance

By Sasha Rogelberg, Reporter
August 26, 2025, 7:00 AM ET
New research indicates endoscopists introduced to AI tools performed procedures less effectively when those tools were no longer available. (Getty Images)
  • Endoscopists introduced to AI-assistance tools during colonoscopies had a lower rate of detecting abnormalities after those tools were taken away, according to a study published this month in the Lancet Gastroenterology & Hepatology journal. Dr. Marcin Romańczyk, who conducted the study, said the results were a surprise, and he speculated that the decrease in detection rates was, in part, a result of overreliance on AI. In critical sectors like aviation, where lives are at stake, there is previous evidence of professionals relying too much on automation at the expense of safety.

Artificial intelligence may be a promising way to boost workplace productivity, but leaning on the technology too heavily may keep professionals from maintaining their own skills. In fact, new research suggests AI may be making some doctors worse at detecting irregularities during routine screenings, raising concerns about specialists relying too much on the technology.

A study published in the Lancet Gastroenterology & Hepatology journal this month looked at 1,443 patients who underwent colonoscopies with and without AI-assisted systems. Endoscopists who had been introduced to an AI-assistance system detected potential polyps at a rate of 22.4% once they no longer had access to those tools, down from 28.4% before the technology was introduced, a roughly 20% relative drop in detection rates.
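
The headline’s “20% worse” refers to the relative change between the two detection rates, not a difference in percentage points. A minimal sketch of that arithmetic in Python (the variable names are illustrative, not taken from the study):

rate_before = 28.4         # detection rate, in percent, before the AI tool was introduced
rate_after_removal = 22.4  # detection rate, in percent, once the tool was taken away

absolute_drop = rate_before - rate_after_removal    # 6.0 percentage points
relative_drop = 100 * absolute_drop / rate_before   # about 21.1 percent
print(f"{absolute_drop:.1f} points, {relative_drop:.1f}% relative drop")

Run as written, this prints a relative drop of about 21.1%, which the headline rounds down to 20%.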

The doctors’ failure to detect as many polyps in the colon once they were no longer using AI assistance came as a surprise to Dr. Marcin Romańczyk, a gastroenterologist at H-T. Medical Center in Tychy, Poland, and the study’s author. The results raise questions not only about the complacency that may develop from overreliance on AI, but also about the changing relationship between medical practitioners and a longstanding tradition of analog training.

“We were taught medicine from books and from our mentors. We were observing them. They were telling us what to do,” Romańczyk said. “And now there’s some artificial object suggesting what we should do, where we should look, and actually we don’t know how to behave in that particular case.”

Beyond the increased use of AI in operating rooms and doctors’ offices, the proliferation of automation in the workplace has brought with it lofty hopes of enhanced performance. Goldman Sachs predicted last year that the technology could increase productivity by 25%. However, emerging research has also warned of the pitfalls of adopting AI tools without considering their negative effects. A study from Microsoft and Carnegie Mellon University earlier this year found that among surveyed knowledge workers, AI increased work efficiency but reduced critical engagement with content, allowing judgment skills to atrophy.

Romańczyk’s study contributes to this growing body of research questioning humans’ ability to use AI without compromising their own skill set. In his study, the AI system helped identify polyps in the colon by drawing a green box around the region where an abnormality appeared. To be sure, Romańczyk and his team did not measure why endoscopists behaved this way: they did not anticipate the outcome and therefore had not collected data that could explain it.

Instead, Romańczyk speculates that endoscopists became so used to looking for the green box that, when the technology was no longer there, they lacked the cue telling them where to pay attention. He called this the “Google Maps effect,” likening his results to the change drivers underwent in moving from paper maps to GPS: many people now rely on automation to show the most efficient route, whereas 20 years ago they had to work out that route for themselves.
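
The paper does not publish the detection system’s code, but the green-box cue Romańczyk describes can be pictured with a short, hypothetical Python sketch using OpenCV. Here, detect_polyps is a placeholder stand-in for the real AI model, and the coordinates are made up for illustration:

import cv2  # OpenCV, used only to draw on an image frame

def detect_polyps(frame):
    # Placeholder for the AI detector: returns candidate regions as
    # (x, y, width, height) boxes. A real system would infer these from the video feed.
    return [(120, 80, 60, 60)]

def overlay_green_boxes(frame):
    # Draw a green rectangle around each flagged region, mimicking the visual cue
    # endoscopists reportedly learned to rely on.
    for (x, y, w, h) in detect_polyps(frame):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)  # green in BGR
    return frame

With the overlay switched off, the clinician sees the unannotated frame, which is the situation the study’s endoscopists returned to once the tool was withdrawn.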

Checks and balances on AI

The real-life consequences of automation atrophying human critical skills are already well-established.

In 2009, Air France Flight 447, en route from Rio de Janeiro to Paris, crashed into the Atlantic Ocean, killing all 228 passengers and crew members on board. An investigation found the plane’s autopilot had disconnected, ice crystals had disrupted its airspeed sensors, and the aircraft’s automated “flight director” was giving inaccurate information. The flight crew, however, had not been effectively trained to fly manually in those conditions and followed the flight director’s faulty guidance instead of making the appropriate corrections. The Air France accident is one of several in which pilots who were not properly trained relied instead on automated aircraft features.

“We are seeing a situation where we have pilots that can’t understand what the airplane is doing unless a computer interprets it for them,” William Voss, president of the Flight Safety Foundation, said at the time of the Air France investigation. “This isn’t a problem that is unique to Airbus or unique to Air France. It’s a new training challenge that the whole industry has to face.”

These incidents bring periods of reckoning, particularly for critical sectors where human lives are at stake, according to Lynn Wu, associate professor of operations, information, and decisions at the University of Pennsylvania’s Wharton School. While industries should lean into the technology, she said, the onus for making sure humans adopt it appropriately should fall on institutions.

“What is important is that we learn from this history of aviation and the prior generation of automation, that AI absolutely can boost performance,” Wu told Fortune. “But at the same time, we have to maintain those critical skills, such that when AI is not working, we know how to take over.”

Similarly, Romańczyk isn’t calling for AI to be kept out of medicine.

“AI will be, or is, part of our life, whether we like it or not,” he said. “We are not trying to say that AI is bad and [to stop using] it. Rather, we are saying we should all try to investigate what’s happening inside our brains: How are we affected by it? How can we actually use it effectively?”

If professionals and specialists want to keep using automation to enhance their work, it behooves them to retain their critical skills, Wu said. AI relies on human-generated data for its training, meaning that if that data degrades, so too will its output.

“Once we become really bad at it, AI will also become really bad,” Wu said. “We have to be better in order for AI to be better.”

About the Author

Sasha Rogelberg is a reporter and former editorial fellow on the news desk at Fortune, covering retail and the intersection of business and popular culture.

