
One sure thing you’ll learn from ‘brain-training’ games: how to play games

By Sarah Kaplan, Washington Post

Spend enough time playing “brain-training” games, and you’ll get pretty good at games. But you won’t necessarily get better at anything else.

That’s the conclusion of an extensive review published in the journal Psychological Science in the Public Interest this week. A team of psychologists scoured the scientific literature for studies held up by brain-training proponents as evidence that the technique works – and found the research wanting.

Training tools enhanced performance on the specific tasks they trained, which makes sense: Spend enough time matching colored cards or memorizing strings of letters, and you’ll start to get really good at matching colors and memorizing letters. But there is “little evidence that training enhances performance on distantly related tasks or that training improves everyday cognitive performance,” the authors write. They also argue that the studies used to promote brain-training tools had major problems with their design or analysis that make it impossible to draw any general conclusions from them.

“It’s disappointing that the evidence isn’t stronger,” Daniel Simons, an author of the article and a psychology professor at the University of Illinois at Urbana-Champaign, told NPR. “It would be really nice if you could play some games and have it radically change your cognitive abilities. But the studies don’t show that on objectively measured real-world outcomes.”

Brain-training programs have been controversial for years. Starting in the mid-2000s, a number of experiments suggested that astonishing cognitive improvements could be induced by simple training-game interventions. One of the most high-profile studies, published in the Proceedings of the National Academy of Sciences in 2008, found that about four weeks of brain training dramatically improved young adults’ ability to solve problems they had never encountered before. The big claim was that the technique could produce “vertical transfer” of cognitive skills – in other words, playing games would boost the brain’s ability to do more sophisticated tasks.

But other researchers have had trouble reproducing this work. In 2014, a coalition of 70 scientists published an open letter on the website of the Stanford Center on Longevity questioning whether there was any scientific evidence that training games actually improve general cognitive function.

More than 100 brain-training proponents responded with an open letter of their own on the website Cognitive Training Data. They argued that, though more research is needed, there is evidence that cognitive-training regimens work, and they listed 132 studies to back up their claims. Some of those studies were the same ones that skeptics of brain training cited to cast doubt on the technique.

“How could two teams of scientists examine the same literature and come to conflicting ‘consensus’ views about the effectiveness of brain training?” Simons and his colleagues write. Hoping to bring some clarity to the debate, they analyzed all 132 studies and tried to apply some objective standards.

To definitively prove that brain training results in vertical transfer, studies should include a good control group – one assigned a task comparable to the brain training – to show that it is really the specific technique that led to the improvements. They should test a large number of participants to weed out results that are a statistical fluke. And they should account for expectations and biases – people who play training games are likely to expect to become smarter, much as people who take a placebo pill expect to feel better.

According to Simons and his colleagues, nearly every one of the brain-training studies they looked at failed to meet these standards. The studies that subscribed to psychology’s best practices suggested that brain training made participants better at the specific task being tested but did not lead to any generalized improvements.

The review was lauded by several of the 70 scientists who co-signed the 2014 letter casting doubt on brain training and was, perhaps predictably, criticized by advocates of the technique. Speaking to The Atlantic, Henry Mahncke, a neuroscientist and chief executive of the training company Posit Science, accused Simons’ team of bias: “They twisted every one of those studies to fit their theories that cognitive training can’t work,” he said.

But brain-training programs have already suffered a financial blow. In January, the Federal Trade Commission announced that the creators of the brain-training game Lumosity would pay $2 million to settle charges of deceptive advertising. The company has since scaled back its claims – instead of promoting the game as a way to get smarter, Lumosity’s website describes it simply as “designed by scientists to challenge core cognitive abilities.”

Training games may someday prove to boost the brain, Simons and his colleagues write. But they haven’t yet.