The journal Frontiers in Behavioral Neuroscience recently published a landmark study demonstrating that playing a new Cambridge University-developed “brain-training” app on an iPad for eight hours over a month improved attention and concentration.
The game, Decoder, fosters a form of attention that activates a fronto-parietal network in the brain. Wow. Let's all get out our credit cards and order up more and more of these brain-training "games."
Hold up. Given the glut of brain-development apps out there, it’s worth exploring their relative efficacy in more depth.
Some studies – like those recently used to discredit Lumosity in a major civil case – deliver a definitive verdict of BS: You might as well clip your toenails or binge-watch "Veep" if you want to perk up your cerebral function.
Other studies show noteworthy improvements among users in terms of brain processing speed, working memory, and executive functions.
Much research is equivocal at best: Some sub-activities in certain apps might foster improvement, others might not; maybe they work better on the elderly than on young adults; maybe the benefits are real, but only temporary, or—
—maybe they’re entirely contingent on whether the user believes they work.
A recent study of brain-training applications didn't go the conventional route of quantifying users' gains in focus, meditation, retention, and so on to make the case that the games had fostered cognitive growth.
It focused, instead, on a more common – but perhaps less expected – reason people's brains do tend to improve on a given app: the placebo effect. "Treatment confidence" – even more than the treatment itself – often delivers the real chief benefit.
In fact, for many brain-training app studies, the recruitment methods and associated biases seem to provide the bulk of the benefits.
A typical scenario: Organizers corral college students for the experiment by posting two sets of flyers around campus: One promises cognitive development—what researchers call “suggestive marketing.” The other series of ads simply invites students to “take part in a study.”
The results show that the students who are told that the games could make them smarter (or whatever) end up performing better than those who think they’re just “playing games.”
Hmm. Instructive, no?
Another study, also done with a cohort of college students, seemed to suggest the same thing. And any gain to be had from these brain-training apps is rather minuscule, not to mention tough to quantify.
Not looking too good for our industry, despite the amazing results of the Cambridge Decoder study.
The aforementioned recruitment biases? A psychologist named Joseph Kable, from the University of Pennsylvania, pointed out that these studies tend to be conducted with high-functioning young people as their subjects. The thinking goes, these are folks savvy enough to at least get their butts into college – which implies they could be enjoying robust cognitive health regardless. That would explain why it's so tough to isolate whatever influence these app-exercises might or might not have beyond the general plasticity of the participants' brains.
Here's another problem with testing the validity of brain-training app claims: Joaquin Anguera, an assistant professor of neurology at the University of California, San Francisco (UCSF) School of Medicine, points out that a single app, like Lumosity, can bundle several different games that engage several different parts of the brain. So it's hard to say definitively whether these apps, as a whole, work as they claim to.
Our finances are limited. We have only so much time. And we certainly suffer tech fatigue. So shouldn’t we choose wisely which apps we really need, and which of those really work? Is it maybe a little impractical, even quixotic, to array yourself in all this digital armor before flying headlong into the daily joust? Perhaps. Surely there are not a few snake oil salesmen shilling their useless shit out there.
Our folks here subscribe to their share of brain tech apps, from the tried and true – Spotify – to the shiny and new – Eidetic. More than any other influence – and especially given all the potential trouble of decoding study findings – we trust customer testimony. We find real people's real reviews a better measure of future success, better even than the strictest scientific breakdowns.
Just keep in mind that most people who review an app tend to fall into one of two categories: irrationally exuberant or overdramatically disapproving. Plus, the review site itself might employ filters or algorithms that skew results, pushing scores toward the extremes and away from the middle.
We like apps that provide either a free version or a free trial period during which you can judge their efficacy for yourself, investing only time and energy – not money.
For example, do you feel, after a few guided meditations from Headspace, that your boss's passive-aggressive quirks are a wee bit easier to stomach? Mazel tov! Your meditation app worked!
Will people in lab coats look at your CAT scan and marvel at some significant shift in circuitry? Probably not.
In the end, treatment confidence does not equate to a sham. Especially when it works in tandem with proven science. We give you the tools to focus and the confidence you can better focus, and guess what’s going to happen?