
Over-the-counter sales may usher in a boom time for A.I.-based hearing aids

October 28, 2022, 9:39 PM UTC

Hello, and welcome to October’s special monthly edition of Eye on A.I. Kevin Kelleher here filling in for Jeremy.

The Food and Drug Administration’s move this month to make hearing aids available over the counter could help generate a boom in demand for a new breed of A.I.-powered hearing aids among the nearly 30 million Americans who struggle with hearing loss, executives at companies that make the devices told me.

In July, President Biden signed an executive order calling on the FDA to make hearing aids more easily available in stores. Last week, Walgreens, CVS, and Walmart began selling them directly to consumers. Previously, buying a hearing aid typically required a medical exam, a prescription, and other specialty evaluations. Audiologists or other service professionals would set the price of hearing aids, which could range between $90 and $7,000, according to the National Council on Aging.

The FDA estimates that about one in five people with mild to moderate hearing loss uses hearing aids, partly because Medicare doesn’t pay for hearing aids, only diagnostic tests. Some patients also balked at the price of the devices or the involved process of getting fitted for one. The White House estimates that the average cost of hearing aids will fall by as much as $3,000, opening up the market to more consumers.

“The change by the FDA is one of the biggest changes in the medical-device industries in a generation,” says Andrew Song, founder and CEO of Whisper, a San Francisco-based maker of A.I.-based hearing aids. “It allows companies to develop more innovative products and it places more decision-making power in the hands of patients because hearing isn’t a one-size-fits-all thing.”

Song says Whisper’s hearing aids deploy A.I. in a sound-separation engine that processes all detected sounds to filter out background noise in public places like restaurants and open offices. It works much as self-driving cars use computer vision to see the world and navigate roadways. “We use A.I. in a similar way, but for the sense of hearing,” he says. “It’s kind of like computer listening.”

Whisper and other companies use deep-learning models that are trained on tens of thousands of hours of audio data in different environments and can be tailored to each patient’s individual preferences. Older hearing aids used directional microphones that needed to be manually turned on to emphasize nearby speakers and turned off to give the user an audio awareness of their spatial surroundings.

“The newer devices automatically detect when you’re in a quiet or noisy or musical or windy environment,” says Dave Fabry, chief innovation officer at Starkey, one of the largest hearing-aid makers. “Only then do they enable features like directionality, noise management, wind noise reduction, and feedback cancellation as appropriate.”
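The detect-then-enable logic Fabry describes can be sketched in a few lines of code. In this toy example, a stand-in for the deep-learning scene classifier labels the acoustic environment, and only the processing features appropriate to that scene are switched on. All function and feature names here are illustrative assumptions, not Starkey’s actual software:

```python
# Illustrative sketch of environment-aware feature gating in a hearing aid.
# The classifier and feature names are hypothetical, not any vendor's API.

# Map each detected acoustic scene to the processing features it should enable.
SCENE_FEATURES = {
    "quiet": set(),                                   # pass-through; preserve spatial awareness
    "noisy": {"directionality", "noise_management"},  # focus on nearby speech
    "windy": {"wind_noise_reduction"},
    "music": {"feedback_cancellation"},               # wide bandwidth, no speech emphasis
}

def classify_scene(frame_energy: float, wind_score: float, tonality: float) -> str:
    """Toy stand-in for a deep-learning scene classifier trained on audio data."""
    if wind_score > 0.5:
        return "windy"
    if tonality > 0.7:
        return "music"
    return "noisy" if frame_energy > 0.3 else "quiet"

def active_features(frame_energy: float, wind_score: float, tonality: float):
    """Return the detected scene and the features to enable for it."""
    scene = classify_scene(frame_energy, wind_score, tonality)
    return scene, sorted(SCENE_FEATURES[scene])

# A loud, non-windy, non-tonal frame is treated as a noisy environment.
print(active_features(0.8, 0.1, 0.2))
```

The real systems replace the hand-written thresholds with trained neural networks, but the gating pattern (classify first, then enable matching features) is the same.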

A.I. can also introduce features into hearing aids that have traditionally not been available. Some of Starkey’s hearing aids have embedded sensors that can detect falls and notify trusted contacts via text (people with hearing loss are three times more likely to fall, Fabry says). The sensors can also monitor physical activity, counting daily steps to help address common comorbidities of hearing loss such as heart disease.

Hearing aids can also have a Siri- or Alexa-like virtual assistant built in that can answer questions about the weather or offer reminders for everything from birthdays to when to take medications. “In the U.S., the adherence to chronic medication prescriptions is very low, less than 50%,” Fabry says. “Because people often forget when to take their pills.”

Song and Fabry both believe that many hearing-loss patients will continue to seek out audiologists and other professionals for a fitting and ongoing maintenance. Over-the-counter options may appeal to those willing to do an online hearing test and do routine maintenance themselves—especially more tech-savvy users under age 60 who make up about a third of hearing-loss patients.

“Most people who have hearing loss just say it’s not that bad, that it just comes with aging, but really what they’re saying is that there is stigma associated with it,” Fabry says. “Some people are reluctant to go into clinical or medical facilities. And so any new channel like over the counter that creates more paths for people to procure hearing aids is a good thing.”

Fabry and Song expect the greater availability of hearing aids will not only lower some costs, but will also spur competition, which gives A.I.-based hearing aids a strong edge.

“Developing new products can take months, if not years, but I’m excited about what’s coming in the next 12, 18, 24 months,” says Song. “The real promise of A.I. is that it can enable what I think of as superhuman hearing for everyone. We’re not there today, but it seems more attainable than ever to prove that A.I. can help augment the very powerful engine that is the human brain.”

Thanks for reading.

Kevin Kelleher


The list of confirmed speakers for Fortune’s second annual Brainstorm A.I. conference keeps growing. Among the recently added speakers are Apple’s Yael Garten, Microsoft CTO Kevin Scott, and DeepMind’s Colin Murdoch. They’re part of an impressive line-up of A.I. luminaries participating in the event on Dec. 5th and 6th in San Francisco. Apply to attend today!


Meta is betting big on A.I. even as its revenue slows. Meta’s stock plunged 25% Thursday to its lowest level since 2016, thanks to a 4% decline in third-quarter revenue and net income that fell by more than half. Still, Meta is shoveling more capital into its A.I. technology. A.I. investments will make up substantially all of Meta’s capital expenditure growth next year, the company said. In the first nine months of 2022, capex rose 66% to $22.8 billion, and the company said it could rise another 71% to $39 billion in 2023. “Our current surge in capex is largely due to building our A.I. infrastructure,” CEO Mark Zuckerberg said in an earnings call. “Our A.I. discovery engine is playing an increasingly important role across our products,” he said. A.I. is powering Meta’s content recommendations to users, ad targeting, and business messaging. Zuckerberg said he expects capex to decline as a percentage of revenue “over the long term.”

Google quietly bought Alter, the maker of Facemoji A.I.-powered avatars, for $100 million. Google didn’t formally announce the acquisition, which occurred two months ago, according to an unnamed source cited by TechCrunch. Alter’s website is now offline, and some Alter executives changed their LinkedIn profiles to say they now work at Google. Alter was founded in 2017 as Facemoji, using facial recognition technology to create a dynamic avatar that can stream a user’s facial expressions without using their actual likeness, a privacy feature favored by gamers and other users. Google, which confirmed the transaction but not the deal value to TechCrunch, aims to use Facemoji technology in new content offerings amid rising competition from TikTok and others.

Many corporate executives still struggle to harvest insights from A.I.-powered data analytics. While 95% of companies say they use A.I.-powered predictive analytics in their marketing strategies, four in five marketing executives say they have a hard time making the right decisions from the data at their disposal, and 84% say their ability to predict consumer behavior feels like guesswork, according to a survey by Pecan AI. In a separate study from Forrester Consulting, executives making data-driven decisions using machine learning cited operational roadblocks that could inhibit the deployment of A.I. models, including data transparency, the ability to trace data flows, and breaking down data silos between departments.

More cities are turning to A.I. technology that detects and locates gunshots to curb shooting incidents. Detroit's city council approved $7 million to expand an A.I.-powered gunshot-detection tool developed by ShotSpotter Inc., choosing it over other options such as programs to provide jobs and food and violence-interruption programs, Bloomberg reported. ShotSpotter places acoustic sensors in neighborhoods and uses software to identify and analyze gunshot sounds, which can then alert local police to incidents that might otherwise go unreported. Raytheon Technologies and Aegis AI offer similar systems, some of which can be integrated into camera systems. While some dispute the accuracy and value of such systems, cities like Seattle and Cleveland have expanded their use of gunshot detection, a market that could grow to $1 billion by 2026 from $650 million in 2020.

Our mission to make business better is fueled by readers like you. To enjoy unlimited access to our journalism, subscribe today.