No science supports supposed brain-training apps.
Last year started on a $2 million sour note for brain-training game company Lumosity. That’s the amount the company, which advertised itself via assertions its products could improve memory, focus, even reverse symptoms of Alzheimer’s, agreed to pay the Federal Trade Commission over charges that these claims were unsubstantiated.
It was a watershed moment, not just for the company but for the entire industry, which sold itself on its ability to sharpen cognitive functioning and stave off mental decline. (Criticism had been brewing for a while: In 2014, more than 70 scientists penned an open letter stating there was “little evidence” brain games could accomplish what they purported to do.)
Since its run-in with the FTC, Lumosity has significantly dialed back its claims. It used to describe its online products as “a personal trainer for your brain” able to improve “performance with the science of neuroplasticity, but in a way that just feels like games.” Today, that messaging is gone, replaced by a few caveats: “We need to do more research to determine the connection between improved assessment scores and everyday tasks in participants’ lives,” the company’s website reads.
Well, more research is in, and the results aren’t good for Lumosity or its competitors. The new study, published in the Journal of Neuroscience on Monday, found no evidence that playing brain games (specifically, Lumosity brain games) translated into improvements in cognitive functioning or decision making.
In the study, 64 participants played Lumosity games for 30 minutes a day for ten weeks. Another 64 played web-based video games, while a third group served as a no-contact control. Before and after the ten weeks, all received brain scans and completed a cognitive exam, as well as a test designed to assess decision making and risk tolerance (for example, whether they were more likely to choose a smaller reward now or a larger reward later).
All three groups showed some improvement on cognitive measures when assessed after the ten weeks, says Dr. Caryn Lerman, the study’s lead author and a psychiatry professor at the University of Pennsylvania. Given that one group received no exposure to any kind of online game, this improvement was likely a “learning effect” (i.e., participants were more familiar with the cognitive assessment the second time around).
Overall, the study found “no evidence that personal brain training benefited the participants in terms of improving cognitive performance, working memory, attention, cognitive flexibility, or inhibitory control,” says Lerman. Nor did it suggest that playing brain games “altered brain activity while completing decision making measures.”
(For its part, Lumosity says “it’s a giant leap to suggest this study proves cognitive training is ‘no better than video games at improving brain function’…there remain many open questions in the field — how, why, and in what circumstances cognitive training is efficacious — and so painting in such broad strokes potentially undermines this important, ongoing research area.”)
We spoke with Lerman about the study, and whether the results surprised her.
This interview has been edited for clarity and length.
Why were you interested in studying brain games’ effectiveness?
For the past 16 years, I’ve worked in academia at the University of Pennsylvania researching why it is so difficult to change behaviors and habits that are harmful to one’s health. I was interested in testing whether cognitive exercises would be effective in improving cognitive functioning — but also whether they could be used to affect decision making processes.
I’m also an executive coach. I’m very interested in the broader ways that the average person can improve executive brain functioning.
What does the available research have to say on the topic?
The growing body of research in this space is very mixed, both in terms of the methods and the results.
In terms of the methods, there are a number of smaller-scale studies, many of which did not include active control groups. They simply compared cognitive training to no intervention, rather than to a program that matched brain games in terms of visual stimulation, engagement, and time spent in front of a computer — all factors that could affect outcomes. And so it’s very hard to tease out whether improvements in some of these studies were due to cognitive training.
Given this somewhat murky evidence, were you surprised by the results in your study?
I was surprised. Our hypothesis was that playing brain games would have an impact on brain activity. The hope was that we could identify advantageous interventions based on the results — so it was disappointing that they were negative.
What’s the takeaway?
I wouldn’t discourage people from playing brain games if they find them enjoyable and engaging. It’s not harmful. But I would hope this helps inform individuals’ expectations about brain games’ impact on everyday cognitive functions.
Of course, this is one study. But it adds to the skepticism, which was already out there, in terms of the type of claims being made.