The Pro Bowl quarterback, who personally visited the lab in June 2015, said that NeuroTracker had improved his spatial awareness, the type of vision needed to scan the field for open receivers.
“That’s key as a quarterback, to be able to see things and how they relate to each other really quickly,” Ryan said. “I think that’s exactly what NeuroTracker helps you do.”
He considered it as crucial to his performance as lifting weights or agility training.
“We spend a lot of time working on our bodies,” Ryan said. “It’s equally important to have your mind operating on a high level.”
In fact, with a professional setup costing $6,000, NeuroTracker boasts that after only 12 five-minute training sessions, athletes will begin to notice sustainable benefits in their on-field performance.
“They will see it in their play,” said Castonguay, who is the lead investor in the company.
The reported benefits have varied. One of the earliest customers was Zaichkowsky, a longtime sports psychologist at Boston University who was hired by the Vancouver Canucks to introduce a sports science program in July 2010. He persuaded the front office to set up a NeuroTracker cave in the practice facility, at a cost then of around $40,000.
“Conceptually and scientifically, it made a lot of sense,” said Zaichkowsky, now a member of CogniSens’ science board.
Matt Fast, a professional golfer, was hoping to improve a different type of sustained attention when he began using NeuroTracker on the Web.com tour in 2014. He felt that when he played the chaotic ball-tracking game in the morning, he putted significantly better later in the day.
“Everything would be blacked out except for what your eyes were looking at,” Fast said. “The times that I’ve putted the best were when I was using it. I totally believe in it.”
Some were intrigued by NeuroTracker’s potential for data collection and as a scouting tool, including Jared Micklos, the director of the U.S. Soccer Development Academy. In the fall of 2014, he signed a three-year agreement with the company to test NeuroTracker on youth players at the Development Academy combine in June and December. He said the data was beginning to generate some interesting revelations about differences among positions.
Now they can begin to track how players progress from age 14 to 17, particularly those who go on to higher levels, like the national team.
“Is there a correlation of the score?” Micklos said. “Are these high scores? If that’s the case, then, yeah, continuing to test might make sense.”
Caroline Calvé, a Canadian snowboarder, said that if she had not retired last year she would still be using NeuroTracker. Before the 2014 Winter Games in Sochi, Russia, Calvé started using the program along with other mental-training products to help her concentration, which she said was a mess during the 2010 Games in Vancouver.
“It gave me the time to practice that focus,” Calvé said, adding: “It is tough to measure. It’s tough to say, ‘Is that why I did better? Is it because suddenly I felt I had more control so I gained more confidence?’ It’s so hard to tell.”
Whatever the explanation, however, Calvé could not argue with the results. She jumped from No. 20 in 2010 to No. 6 in the parallel giant slalom in 2014.
“The fact that I was practicing that focus was something that, yes, I feel helped me,” she said.
The Myth of the Cave?
When Nintendo released Brain Age in 2005 — a video game built around puzzles that had nothing to do with Mario or Luigi, testing instead how quickly players could complete mental-agility tasks like speed-counting, word memory and sudoku — it was advertised as a way of “flexing your mental muscles.” It was based largely on the work of a renowned Japanese neuroscientist, and it sold well. A commercial race to bring cognitive-training applications into the mainstream ensued. By 2013, Lumosity, one of the most successful contenders, had attracted 50 million members on the promise that playing its games could improve performance at work or school.
But the notion that practice at one task can effectively bolster abilities in another — called transfer — has long been disputed. In 1906, the psychologist Edward L. Thorndike found that rigorous practice improved students’ ability to estimate the areas of rectangles, but it did not help them estimate the areas of other shapes. Sir Charles Sherrington, the Nobel Prize-winning neurophysiologist, likewise believed that “acquisition of a habit is not transferable beyond its application.” Another century’s worth of research has continued to reshape and redefine, expand and restrict this line of thinking. K. Anders Ericsson’s seminal study on expertise — the basis of the so-called 10,000-hours rule — emphasized training that is specific to the skill. This means that memorizing an extraordinary quantity of numbers is not likely to improve your recall of names, and that practicing Tetris can improve your ability to play Tetris, but it probably won’t help you get better at juggling.
With NeuroTracker, Faubert has made some effort to rebut this idea, but not everybody has been persuaded. Williams, at the University of Utah, challenged the notion that tracking bouncing objects in a simulation could train or quantify anything other than a person’s ability to track bouncing objects in a simulation.
“I’ve never seen a soccer player chasing multicolor balloons around on the field,” Williams said. “It’s just not what soccer players do.”
What soccer players do, he said, is read patterns of play, anticipate what might happen next based on movements of teammates and opponents, and identify familiar sequences as they unfold. This “inside” knowledge, built up over time, promotes the effectiveness and efficiency that Ericsson argues are the hallmarks of expertise.
Williams is not alone in his skepticism. In an exhaustive review published in the October issue of the journal Psychological Science in the Public Interest, researchers criticized the sweeping claims made throughout the $1.3 billion brain-gaming industry related to “cognitive improvement.” The researchers concluded that the evidence was “limited and inconsistent” that commercial brain-training software could enhance cognition outside the laboratory in the ways the companies described. (Last January, Lumos Labs, the maker of Lumosity, settled Federal Trade Commission charges of deceptive advertising for $2 million.)
CogniSens was one of 30 companies whose marketing claims the lead author, Daniel J. Simons, and his colleagues examined in the study, and it was named among a dozen companies that cited no peer-reviewed evidence from intervention studies. Castonguay said he had emailed Simons to explain that CogniSens had, in fact, published multiple studies supporting NeuroTracker but had not provided links on its website. (The website now lists 12 supporting papers and studies.)