Stop Freaking Out About That Study Linking Diet Soda to Alzheimer’s and Strokes

You may have come across some pretty alarming takes on diet soda going into this past weekend. “Daily dose of diet soda tied to triple risk of deadly stroke,” blared Fox News. Outlets like the Washington Post and CNN repeated the assertion that drinking artificially sweetened beverages may mean your brain is headed for a blood clot or serious mental deterioration because you like to drink Diet Coke. Don’t believe the hype; the situation probably isn’t nearly as dire as that.

Some of the reports about this “deadly diet soda” study have been more nuanced than others. But there’s a common theme among a lot of them—they don’t outline some of its most crucial and relevant caveats until way past the headline. And if they did, the titles would be pretty boring. Like, “Study finds minor observational link (but no direct cause-and-effect) between drinking artificially sweetened beverages and certain health risks, but it has a small sample size that doesn’t include minorities or account for a whole bunch of other critical factors.”

That’s not exactly as sexy as claiming that a Diet Coke a day will bring Alzheimer’s in its wake, or triple the chances of a stroke. But science, fortunately (or unfortunately if you’re trying to grab clicks at the expense of good information), isn’t meant to be sexy. It’s meant to test hypotheses and express facts. And when the results of scientific experiments are presented without context, they lead to misleading, panicky headlines like the ones that dominated the Internet on Friday.

Physician Aaron Carroll, who writes for one of the most clear-eyed, if wonky, health care websites out there—the Incidental Economist—and has a delightfully no-BS, data-driven column on the New York Times’ Upshot site, highlights several reasons why you should take this new study with a grain of salt.

Did the participants differ by race or ethnicity? I have no idea. I do know, however, that the authors write about the “absence of ethnic minorities, which limits the generalizability of our findings to populations of non-European descent.” Was that in the coverage you read?

Did they differ by socioeconomic status? No idea. Did they abuse drugs? Work or retire? Live alone or with someone? Have a family history of disease? No idea.

Did they acknowledge that different artificial sweeteners are different molecules with likely different effects or implications? No.

Were there multiple comparisons, meaning some results might be due to chance? Yep. Did they rely on self-report, which might mean recall bias comes into play? Yep.

Was this an observational study? Of course.

Was all of that in the coverage you read?

Carroll’s explanation is a lot more in-depth than that, digging into nerdy-but-important factors like the actual models the study’s authors used, the limitations they openly admitted to, and information we simply don’t know about their analysis.

But this does reflect a common theme in mainstream media science reporting. The drive to report the most provocative (in many cases, concerning) headlines obscures the incremental, nuanced, and decidedly not-reductive nature of good science. If you were to rely on flashy media headlines alone, you might think that everything causes cancer—or prevents it!

None of this is to say that sugar alternatives don’t come with health risks; they very well might. But limited, observational studies about public health trends can only take you so far down the path to real knowledge. So don’t feel pressured to freak out about that diet drink because the Internet told you to.