The rules that guide health apps and activity trackers are a work in progress.
The ads, which played on the radio, in television commercials on channels from HGTV to CNN and in Google searches, were pervasive and seductive. The copy varied, but the message was always the same: brain games were the antidote for a range of cognitive issues, from garden-variety memory loss to ADHD to Alzheimer’s.
“No matter why you want a better brain, Lumosity.com can help,” a man reassures, via voiceover, in one ad. “It’s like a personal trainer for your brain, improving your performance with the science of neuroplasticity, but in a way that just feels like games.”
While some of the company’s reported 35 million users bought into the fantasy, paying $14.95 a month for a subscription to the company’s desktop and mobile games, the Federal Trade Commission was unconvinced. In January, the agency sued Lumos Labs, the maker of Lumosity, for deceptive advertising, claiming its products’ effectiveness lacked valid scientific backing. Lumos Labs settled for $2 million, although it continues to make and market brain games.
“The settlement pertained to certain advertising language from past marketing campaigns,” Erica Perng, the company’s director of communications, said via email. “It is important to note that this settlement does not speak to our recent marketing or the quality of our products.”
It may be a rose-tinted interpretation — the FTC’s withering critique of the company’s lack of evidence reads very much like a judgment of quality — but fundamentally, she’s right. While the agency made passing references to the Food and Drug Administration, the federal body tasked with protecting public health, the settlement centered on Lumosity’s deceptive advertising language.
For an emerging industry unsure of when and how the regulatory shoe would drop, it was an illuminating detail.
Technology’s power can no longer be siloed; its tentacles are everywhere. Its influence on health care and personal wellness is already obvious, but there’s a sense we’re on the brink of a sea change driven by personalization, one that’s already changing how physicians treat patients.
It’s easier than ever for consumers to directly tap into the trend. Wearables are booming and smartphones are now the rule, rather than the exception. Last year, 64% of American adults had one, according to a Pew survey, which means 64% of American adults owned a sophisticated health and fitness tool. There are more than 165,000 health apps, according to a recent IMS Health count; while the majority track simple data points, such as steps, heart rate and the number of calories burned, many go beyond this. There are apps that purport to turn a smartphone into a detection and monitoring tool for diabetes, heart conditions, depression and, yes, Alzheimer’s, among other conditions.
As the industry has blossomed, pressure has grown for regulators to manage these claims. But if the entire digital health industry is new, the surrounding regulatory framework remains a work in progress.
A test case
The rules may still be coalescing, but 23andMe provides a cautionary tale on the danger of ignoring regulators. Citing concerns about the accuracy of the results, in 2013 the Food and Drug Administration banned the startup from selling its battery of saliva-based genetic tests to consumers, shutting down its core business.
The decision was a wakeup call for 23andMe, which, in the words of founder and CEO Anne Wojcicki, had been “behind schedule” in responding to the agency’s demands. The startup went into immediate damage control mode — in a statement, it painted its relationship with the FDA as “extremely important” and emphasized that it was “fully engaging with them to address their concerns.”
Since then, it has slowly worked its way back into the FDA’s good graces. In February 2015, the agency granted the company permission to sell its direct-to-consumer test for Bloom Syndrome, a rare genetic condition. This paved the way for 23andMe’s announcement, eight months later, that it would sell direct-to-consumer “carrier status” tests, in which users can determine their risk of passing 36 genetic diseases to their children.
For other companies in the digital health space, this arc was a lesson. “It made everyone realize, ‘this is not a game, it has to be taken seriously and you need to factor working with regulators into your resource plans and your timeline,’” says Steve Krein, the CEO of StartUp Health, an incubator for digital health startups.
For 23andMe and other startups that provide “medical diagnostic advice,” working with the FDA is “a requisite for doing business,” says Bryan Roberts, a partner at the health-care and technology VC firm Venrock. That said, since its showdown with the company, the FDA has made it clear it isn’t interested in all — or even most — digital health-care startups.
Last year, the agency released its recommendations for mobile app developers, which break digital health apps into three buckets. In the first: Apps that are clearly not medical devices and are therefore outside of the agency’s jurisdiction. In the third: Apps that function as or control medical devices, such as a blood pressure monitor, and could pose a risk to patients’ safety if they don’t work as intended. As a broad rule, “if a physician is using it or recommending its use as part of your medical treatment, then you are going to see the FDA take a closer look,” says Morgan Reed, executive director of ACT, a trade group representing app developers.
It’s the middle bucket — reserved for apps intended “for general wellness use,” i.e., the majority of consumer-facing digital health apps on the market — where the agency’s involvement gets murkier. While the FDA isn’t interested in overseeing the majority of these apps, “it’s not giving up the ability to do so if something bad happens,” says Reed.
While Lumosity falls into this somewhat hazy second bucket, it’s hard to argue “brain games” pose a real threat to patients’ safety, despite the lack of scientific evidence for their effectiveness.
Where the FTC steps in
“If you claim your product does something that it doesn’t do, that’s the FTC,” says Joy Pritts, the former chief privacy officer at the U.S. Department of Health and Human Services.
The FTC has long regulated claims made by supplements, diet pills and homeopathic remedies. But as the sea of health apps has expanded, the agency “has taken a broad reading of what’s covered under its provision,” to include claims put out by digital products, says Reed. For supplements and health apps alike, the approach allows it to target companies the FDA won’t pursue.
Like Lumos Labs.
Outside the FDA’s jurisdiction, the FTC went after the company on the grounds it was making deceptive claims without the scientific evidence to back them up. More specifically, “Lumosity preyed on consumers’ fears about age-related cognitive decline, suggesting their games could stave off memory loss, dementia, and even Alzheimer’s disease,” the agency’s director of the Bureau of Consumer Protection said in a statement.
It’s not the first time the FTC has targeted health apps and software on the basis they were misleading consumers. Last year, it settled with two separate apps that claimed to detect melanoma. And in 2011, it fined the marketers of AcneApp, an app that purported to treat acne via light emitted from users’ iPhone screens.
As far as deceptive claims go, it doesn’t get more clear-cut than an app promising to zap away acne with a light. But Reed worries that with Lumos Labs, the FTC is on slippery ground. When initially fined, the company pointed to internal studies showing its brain games did, in fact, lead to cognitive improvement. But to the FTC, these didn’t qualify as “rigorous, scientific support.”
Which puts the FTC in the difficult position of determining what, exactly, constitutes valid evidence, and startups in the uncertain position of having to guess.
A regulatory quilt
Digital health startups, then, are regulated in something of a piecemeal fashion by the FDA, the FTC and the Health Insurance Portability and Accountability Act (HIPAA), a law designed to protect patients’ medical records. Crucially, HIPAA only applies to startups that collect or access data held by health-care providers. Most direct-to-consumer digital health companies are not covered by HIPAA — neither 23andMe nor Lumosity must comply — although there’s still widespread confusion around its reach, even among physicians.
This uncertainty can impact funding. In Reed’s experience, startups are less likely to receive VC money if they must first explain how regulations do — or don’t — apply to their business. The intricacies of HIPAA rarely make for a compelling elevator pitch. It can also dissuade entrepreneurs from entering the digital health industry in the first place, says Darren Dworkin, Cedars-Sinai’s chief information officer, a dynamic that encourages too many “smart, innovative entrepreneurs to build the next music app.”
The Snapchat of digital health
Krein, of health-care incubator StartUp Health, doesn’t buy that digital health entrepreneurs have a more difficult time with funding. (More than $4.5 billion was invested in the space last year, according to the VC firm Rock Health, although much of it went to companies operating within the health-care system.) At this point in the digital health lifecycle, he says, investors and entrepreneurs are educated enough that the regulatory system no longer “spooks” them.
The real problem may be a lack of success stories. For traditional biotech and medical device companies, the payoff for navigating what can be a long regulatory road is already established. When a startup’s drug or device gets FDA approval, its valuation typically skyrockets. But for strictly consumer-facing products or services in the digital health space, it’s not clear the same FDA bump applies. “I can’t think of many that have gone on to make a lot of money,” says David Kim, a former partner at the life-science venture firm MPM. “If there are regulatory hurdles but you don’t see a clear carrot, what are you working towards?”
With its $1 billion valuation, 23andMe seems to be an exception. However, “that’s not going to be repeated by anyone else anytime soon,” says Kim. “That was largely Google money.” (Wojcicki’s ex-husband is Google co-founder Sergey Brin, and the company poured $7 million into her startup, much of it in crucial early rounds.) In other words, good luck raising the same amount of capital from actual VCs.
Roberts, of Venrock, isn’t interested in mail-in kits or health trackers for healthy consumers, which aren’t filling a vital need and therefore lend themselves to faddish apps and devices. Instead, he sees opportunity and money in medical products that enable the chronically sick to manage their diseases from home instead of the hospital, such as smartphone-based glucose monitors for diabetics.
AliveCor believes opportunities lie somewhere in between. The company, which makes a medical-grade mobile EKG reader used by physicians and consumers alike, is one of a growing number of startups in the so-called wearable med-tech space. “You have traditional medical device companies that design products for doctors, and then you have your Fitbits, which do an awesome job establishing the fitness-tracking category,” says Doug Biehn, the company’s chief commercial officer. “We’re not playing in either.”
Instead, the company sees itself in a hybrid space, a position designed to appeal to sick and healthy consumers alike. Its EKG reader — which records EKGs via a thumb pad that users can attach to the back of their iPhones — is sold direct-to-consumer, but its biggest market is doctors who use it in their offices, says Biehn. And while the device is often recommended by physicians for their patients, 75% of consumers don’t have a chronic heart condition. Instead most buyers — who are typically in their 50s and 60s — are worried about developing a condition or are interested in keeping better tabs on their overall health. (The reader is able to tell users whether their heartbeat is normal or whether it has detected possible atrial fibrillation, a leading indicator of stroke or cardiac arrest. If they want a more complete interpretation, they can send the data to a cardiologist through the AliveCor app.)
For AliveCor, the lengthy, complex road to FDA clearance serves as a useful barrier to entry, says Biehn. The company hopes it’s a big enough hurdle to stave off competitors. This is a growing concern, as traditionally consumer-facing companies such as Jawbone and Fitbit have indicated they, too, are interested in the health space.
What, then, of these activity trackers and other digital health startups that operate like traditional tech startups, in that they require short lead times, small seed investments and reside firmly outside the jurisdiction of HIPAA or the FDA? “How many of these companies have made any real money?” says Kim, who believes the vast majority will be gone within the year.
For Krein, that’s the wrong question. Yes, most digital health companies will fizzle (as will most startups). But five or 10 years down the line, as the implications of this vast amount of quantified health data take root, “there will be the Facebook of personal health or wellness — rest assured.” Just because we can’t envision what it will look like — he concedes it likely won’t be a standalone activity tracker — doesn’t mean it won’t happen.
If a Facebook equivalent emerges in the health-care space, as with the actual Facebook, it’ll likely be all about personalization. As it becomes easier for consumers to collect health data, the most important advancements won’t come from devices or sensor technology, but “the machine learning that you can do on top of that data to determine patterns,” says Biehn. If mined correctly, these patterns can unearth new treatment plans, earlier diagnosis and preemptive care.
To tap into this trend, companies will need to make sure they are on the right side of regulators. Because as Lumosity and 23andMe learned the hard way, their own health depends on it.