That’s BrainEye’s claim. What does the evidence say?
It’s all in the eyes
I sat down with BrainEye’s app on Monday. I held my phone in two hands, elbows resting on my desk for stability, and then followed a small bobble with my eyes as it wandered across my screen.
This is a test of “smooth-pursuit eye movement”: the ability to keep a moving target centred on my retinas. Athletes with concussion tend to struggle with this test, their tracking slow, the target often missed.
“Almost half the neurons in the brain are involved in processing vision,” says Fielding, who is also a research fellow at Monash University’s Department of Neuroscience.
“Concussions hit the brainstem and frontal lobes particularly hard. When you smack your brain around, it’s disrupting networks. The brainstem is especially vulnerable. That’s where all the signals are generated for an eye movement.”
Victor Radley of the Roosters is attended to by trainers after a head clash during the NRL match with Brisbane Broncos. Credit: Getty Images
A concussion causes short-term effects on neural patterns and, potentially, longer-term harm to the brain tissue.
My BrainEye test informed me that I did not have a concussion, which is good news. Can it spot the red flags of concussion in athletes? That’s where it gets murkier.
Does it work?
The company has received a lot of positive press for a validation study it ran on AFL footballers, in which the tech spotted 100 per cent of footballers with concussions, and had a false-positive rate of about 15 per cent.
We should note other tests for sport-related concussion are not 100 per cent robust. A 2023 systematic review found the sensitivity of three common tools was between 50 and 88 per cent; all tools had false-positive rates of about 15 per cent.
But the study itself, funded by BrainEye and published in Sports Medicine – Open in March, has several issues. First, the sample size: just 11 concussed AFL footballers in total (plus baseline data from 384 non-concussed players).
“Such a low sample size means this must be viewed with caution,” says the Australian Institute of Sport’s Hughes, who is also lead author of the AIS Concussion and Brain Health Position Statement.
It’s also worth noting here three of the four Monash University researchers who conducted the Sports Medicine – Open study now work for BrainEye.
The study was done on players the researchers already knew had a confirmed concussion. It was unblinded. And the paper does not report confidence intervals, the standard measures that tell us the level of uncertainty in the data. One statistician who read the paper ran a quick back-of-the-envelope calculation: the plausible detection rate was much lower than BrainEye's headline 100 per cent.
“I would hope to see more data collected before these sorts of claims can be substantiated,” Swinburne University’s leading concussion researcher, adjunct professor Alan Pearce, tells me.
The method of detecting potential changes to the brain in the study – red flags for concussion – is also intriguing.
BrainEye took two measures, smooth eye tracking and “pupillary light reflex”, the quick response of the pupil to light, and combined them into an overall BrainEye score.
It then generated a cut-off value for each measure, and for the overall BrainEye score. If an athlete’s score was below the cut-off, they were assessed as concussed.
“It isn’t clear how this is calculated as it isn’t a direct average of these two outputs,” says associate professor Frances Corrigan, a concussion researcher at the University of Adelaide.
Indeed, of the concussed footballers in the study, one had smooth eye tracking above the cut-off, and four had pupil reflexes over the cut-off.
BrainEye tells me it no longer uses pupil reflexes in its app, and instead uses two measures of smooth eye tracking built on more than 150,000 completed tests. (I asked for additional clinical validation data, but it wasn’t provided.)
Then there’s the usability question. A smartphone concussion test seems like a no-brainer. But when the researchers tried to enrol AFL clubs in their study, five declined because “they found the kit and set-up too difficult and/or time-consuming to incorporate into their existing post-concussion assessment protocol”.
Of the 10 clubs that did agree to take part, only three integrated BrainEye into their concussion screening. Even then, several concussions were missed because staff “forgot” to use the device.
Why did clubs find it so tricky to use, given it’s just a smartphone?
Well, when it was tested in 2022, BrainEye wasn’t quite a smartphone. The tester version came with a custom stand and chin-rest, an LED light bar and an IR camera. Athletes had to sit on a height-adjustable chair to use it correctly.
Even with the stand, about 10 per cent of players did not manage to get the tests to return usable data, often because they were moving their heads too much.
The current version of BrainEye’s app works without a stabilising stand. So is it reasonable to keep relying on validation data gathered with the stabilised version, when the unstabilised app now captures different data from the eyes?
Addressing this concern, the company provided an unpublished study titled “Clinical validation of the BrainEye Smartphone Application”.
The study tested BrainEye’s unstabilised app against two medical-grade devices: the Tobii Pro Glasses 3 (RRP $13,000 plus) and the NeurOptics NPI pupillometer. It found the three devices produced highly similar results. “Our conclusions are accurate and valid,” says Fielding.
David Hughes, of the Australian Institute of Sport, is more sceptical. BrainEye “cannot be recommended as a reliable tool for diagnosing concussion”, he says. “Further studies are needed with improved research methodology, and we also need for these studies to be done within the community sport environment.”
We should be careful, I think, about damning an Australian innovation for not having done every study it needs; BrainEye remains under development. It is not yet regulated as a diagnostic device.
But if the tech is not yet ready for prime time, what alternatives exist?
Non-smartphone tools already exist that let non-medicos spot concussion – the CRT6 asks fairly simple questions such as whether the athlete has blurred vision or neck pain, or feels irritable.
We could also put in place things to minimise the risk – such as banning heading in soccer training – and invest properly in training players and coaches to spot concussion at amateur level. And we can try harder to change a sporting culture that still seems to think blunt-force trauma to the head is acceptable.
Swinburne’s Alan Pearce says: “Everybody thinks that ‘tech’ will save the day, but it’s understanding the seriousness of the injury and cultural change towards concussion. It’s not just a ‘head knock’.”
Enjoyed Examine, our free weekly newsletter covering science with a sceptical, evidence-based eye? Sign up to get the whole newsletter in your inbox.