Decoding the business of brain–computer interfaces

Nature Electronics

Fifty years after the term brain–computer interface was coined, the neurotechnology is being pursued by an array of start-up companies using a variety of technologies. But the path to clinical and commercial success remains uncertain. By Liam Drew

 

In 1973, Jacques Vidal, a computer scientist at the University of California, Los Angeles, wrote a paper in which he considered the fact that electrodes placed on a person’s scalp could detect real-time signals emanating from that individual’s brain1. Electroencephalography (EEG) produced complex and often difficult-to-interpret signals, Vidal wrote. But he predicted that as neuroscientists came to better understand this readout of neural activity — to decipher components of it that correspond to aspects of someone’s feelings, thoughts or wants — an EEG might be used as a control signal for a computer.

Vidal coined the term brain–computer interface (BCI), and asked, “Can these observable electrical brain signals be put to work as carriers of information in man-computer communication or for the purpose of controlling such external apparatus as prosthetic devices or spaceships?”

In the second part of his paper, Vidal described pilot experiments in which he’d attempted to have volunteers use their EEG to fire missiles at spaceships in a computer game.

Half a century later, BCIs are big business. The advances in “neurophysiology […], signal analysis techniques […] and in computer science,” that Vidal said were needed to make this technology feasible have, to varying extents, materialized. And more than US$1 billion has poured into this sector in recent years, funding numerous start-ups.

While the head of the world’s best-funded neurotechnology start-up — Elon Musk — also owns a spaceship company, it is the control of prosthetic devices that has dominated academic research on BCIs. And the mission of several companies today is to develop assistive technology that restores movement and/or communication capabilities to people who have lost these through paralysis-causing injuries or diseases. Gaming also represents a target market for companies that are asking how the broad potential of devices that directly detect users’ brain activity might become a mass-market product. In short, what’s the killer app for a BCI?

Choose your signal

Where Vidal’s vision fell short was his focus on EEG as an input for BCIs. Despite intensive efforts to better decode this signal, it remains a very coarse readout of neural activity. “The hard line is physics. If you sit outside of the skin, outside of the skull, the kind of signals that you have access to have much, much lower signal to noise than if you are in tissue,” says Matt Angle, founder and CEO of Paradromics, a BCI company based in Austin, Texas.

Consequently, as BCI research evolved in academic labs through the 1990s, it became clear that to obtain high-resolution brain readouts capable of controlling external devices in nuanced ways, electrodes would have to be implanted at least directly on the brain, if not actually within neural tissue.

“It is night and day in terms of the signal fidelity, invasive versus non-invasive,” says Vikash Gilja, a computer scientist working on BCIs at the University of California, San Diego. But he explains that what is most important is getting the appropriate signal for the application you’re developing.

In Gilja’s view, an EEG signal — attenuated by the bony cranium and reflecting the averaged activity of millions of neurons spanning multiple square-centimetres of cortex — is potentially good for real-time monitoring of “a global, general brain state that is transitioning slowly.” Examples include wakefulness, attention, or meditative states, which are accompanied by population-level changes in neuronal firing.

EEG might also be used for controlling external devices with a time lag of tens to hundreds of milliseconds, Gilja says, but “we’re likely talking about very low bit rate outputs, so things that might be binary switches.”

Ben Rapoport, a neurosurgeon who heads Precision Neuroscience in New York City, agrees. He says the term BCI “is synonymous with high bandwidth. Really smooth communication between the brain and the outside world requires a high-bandwidth connection. And there’s not a way to achieve high-bandwidth connection without touching the brain.”

But the obvious and considerable advantage of an EEG-based BCI is that it does not require its user to have electrodes surgically implanted in their cranium. And numerous developers are currently working on applications of such systems. What they must do is detect reliable low-resolution signals and develop effective ways of utilizing them — be that for clinical use or mass-market adoption.

The academic research that demonstrated the value of intracranial recordings has largely focused on the clinical goal of allowing people with severe paralysis to control external devices such as robotic arms or wheelchairs and to generate synthetic speech solely by thinking.

Much of this research has concentrated on the motor cortex, the brain area that generates the signals that command the body’s musculature. Hence, robotic arms are, for example, controlled by someone imagining moving their own arm. And thought-to-text applications may involve someone imagining themselves handwriting or moving their limbs left or right to move a cursor.

The several companies now developing invasive BCI systems for clinical use can be divided into those using electrodes that record from the surface of the brain and those whose electrodes penetrate neural tissue to record from individual neurons.

Surface electrodes perform electrocorticography (ECoG). Like EEG, ECoG signals reflect the averaged activity of many neurons — but, in this case, thousands or tens of thousands of them. Plus, its spatial resolution is square millimetres of cortex, rather than square centimetres, and there is much less signal attenuation.

Users of ECoG can point to several landmark BCI research papers indicating that this signal can be used for sophisticated device control. Yet advocates of single-neuron recordings say that obtaining essentially digital readouts of the discrete electrical pulses by which neurons process information represents a step change in the type of readout.

What’s more, says Angle, with each additional neuron that a BCI records from in a given brain region, there is an almost-linear increase in the amount of information this provides about the functioning of that brain region. “When you think about doing more and more applications and opening up the field of what interfaces can do,” he says, “if you’re working with penetrating electrodes that record action potentials, then those scaling laws are meaningful, because the more electrodes you can get in, the more data you have.”

The challenge for implanted-device developers is creating recording systems that can be safely inserted and safely kept in place, to acquire stable recordings for a suitable period, which in many cases will be years if not decades. And this is a source of much engineering research.

Non-invasive BCIs

For all of EEG’s limitations, inexpensive systems for recording this signal are now widely available and numerous companies are today bidding to develop products that use this signal.

Luca Tonin, a BCI researcher at the University of Padova, has, for example, shown in academic studies that left/right decisions can be reliably made using EEG readouts. But he highlights that commercial EEG products are even further restricted by the fact that no company wants a product that requires the normal — messy and laboriously applied — conducting gel that improves signal quality.

Indeed, one standout feature of these tech companies is their commitment to developing aesthetically pleasing devices that people will happily wear.

Among such companies, there is currently an emphasis on what Gilja describes as global brain states: the emerging products are relatively simple EEG systems that monitor overall brain activity in ways that allow users to better control or respond to that activity.

As such, these devices do not fit the classical view of BCIs: that is, they do not enable volitional control of external computers. Instead, they make users consciously aware of their own brain state, placing them in an ‘open loop’ with their brain activity, so that they might adjust their behaviour and thoughts accordingly or have their brain activity control some aspect of their computing experience.

Examples include Neurable, a Boston, Massachusetts-based start-up, which has incorporated 16 ‘invisible’ EEG electrodes into a pair of noise-cancelling Bluetooth headphones; NextSense of Mountain View, California, who are developing earbuds that contain EEG sensors; Toronto, Canada-based InteraXon, and their product Muse, a headband that clips over the ears and runs around the forehead to monitor the EEG above the frontal lobes; Neurosity, of Brooklyn, New York, who have created a slick curving headband device that they call the Crown; and Emotiv, whose various EEG headsets and earbuds provide different EEG monitoring capabilities.

Neurable and Neurosity aim to increase users’ productivity and/or focus by monitoring brain waves associated with an attentive state, whereas Muse aims to help users improve their mindfulness by measuring activity patterns associated with deep meditative states. NextSense, meanwhile, are looking to develop software that monitors brain health.

Other companies are, however, aiming to allow users to control external devices. One is Paris-based company NextMind, who developed a lightweight EEG sensor that fits around the back of a user’s head, so that its electrodes lie above the visual cortex. Designed to detect electrical patterns associated with users focusing on elements of a laptop display, it enables users to play rudimentary games and to select items on screen. In March 2022, social media company Snap (the developer of Snapchat) acquired NextMind to further develop and operationalize this technology.

Also using EEG electrodes that sit above the visual cortex is Cognixion, a Santa Barbara-based start-up. Founded by Andreas Forsland in 2014, Cognixion’s ambition was to help people who have various forms of physical incapacity to communicate.

Over the years, the company investigated several types of sensor, including accelerometers and eye-tracking devices, aimed at allowing users to deploy their remaining movement capabilities to control external devices. But an EEG BCI is now central to the assistive technology headset the company is poised to release.

That headset — the Cognixion ONE — centres on an augmented reality, or ‘assistive reality’, visual interface displayed on a visor before the user’s eyes. This displays multiple options that users can select from by staring at a target item. And the company has deployed a clever strategy to determine which item users are looking at.

Each target option flickers at a different frequency — and this flickering influences the nature of the visual cortical EEG signal evoked by looking at it. After a period of calibration, machine learning software can then distinguish — over a second or three — which item a user is selecting.

“We’re writing to the brain with frequencies,” says Forsland, “and we’re hunting for those signals that are being injected into the brain.”
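For readers curious how such frequency tagging can be decoded in practice, the short Python sketch below illustrates the general idea: compare the power of the occipital EEG at each candidate flicker frequency and pick the strongest. The sampling rate, flicker frequencies and function names here are illustrative assumptions, not Cognixion's actual pipeline.

    # Minimal sketch of frequency-tagged target selection from occipital EEG.
    # Sampling rate and candidate flicker frequencies are assumptions.
    import numpy as np
    from scipy.signal import welch

    FS = 250                                  # assumed EEG sampling rate (Hz)
    TARGET_FREQS = [8.0, 10.0, 12.0, 15.0]    # hypothetical flicker rates, one per option

    def select_target(eeg_window: np.ndarray) -> int:
        """Return the index of the flicker frequency with the most power.

        eeg_window: 1-D array of occipital EEG samples (roughly 1-3 s of data).
        """
        freqs, psd = welch(eeg_window, fs=FS, nperseg=min(len(eeg_window), FS * 2))
        scores = []
        for f in TARGET_FREQS:
            # Sum power in a narrow band around the flicker frequency and its
            # second harmonic, which visually evoked responses typically contain.
            band = (np.abs(freqs - f) < 0.5) | (np.abs(freqs - 2 * f) < 0.5)
            scores.append(psd[band].sum())
        return int(np.argmax(scores))

In use, a couple of seconds of EEG recorded while the user fixates an option would be passed to select_target, and the returned index identifies which flickering item they were most likely looking at.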

Forsland says that Cognixion plans to release its headset to clinicians and researchers next year and that it is in discussions with the US Food and Drug Administration (FDA) to have the device clinically approved for people dealing with various forms of paralysis. He also notes that the assistive reality interface is compatible with other types of BCI.

A company that already has FDA approval for its non-invasive BCI is Neurolutions of Santa Cruz, California. Its Ipsihand system is designed to help people who have lost the use of a hand following a stroke on one side of their brain to regain control of that hand.

Their approach, explains senior director of engineering, Rob Coker, is based on the fact that while each of our hands is primarily controlled by the motor cortex on the contralateral side of the brain, a small fraction of the motor cortex on the same side as each hand is capable of controlling that hand. The Ipsihand system aims to induce or accelerate forms of neuroplasticity that will increase the ability of the ipsilateral motor cortex to control the stroke-affected hand.

Approved by the FDA in April 2021, the BCI records EEG signals from above the uninjured ipsilateral motor cortex and couples these to a robotic hand piece that surrounds and moves the paralysed hand. “As soon as the patient is thinking, ‘I really want to move my impaired hand’,” says director of clinical affairs, Lauren Souders, “it picks up on the signal and opens the device fairly quickly.”

Previous therapeutic devices have used electrical stimulation to encourage neuroplasticity, she says, but this system couples movement intention to actual movement, providing a much more powerful stimulus for learning and plasticity. In a clinical trial, two-thirds of users achieved clinically significant improvements after six months2,3. For instance, says Souders, “They’re able to grasp something and carry it somewhere.”

One interesting addition to the non-invasive BCI space is a system developed by Los Angeles-based neurotech company Kernel. Founded by entrepreneur Bryan Johnson in 2016, and inspired in part by Johnson’s own depression and his frustrations at how little was understood of the brain, Kernel sought to develop a system that provided high-quality information about the workings of the brain and mind, says chief technical officer, Ryan Field.

Initially, the company was interested in implanted recording technologies, but Field says, Johnson pivoted the company and said, ‘We’re gonna go non-invasive, because non-invasive is the only path to really bringing neurotechnology to the masses.’

Kernel opted against EEG and tried two other approaches. One was magnetoencephalography (MEG), the other functional near-infrared spectroscopy (fNIRS) — the technology that an Apple Watch uses to measure blood oxygenation.

Field thinks MEG, which measures magnetic fields created and emitted by neural activity, yields a more spatially precise signal than EEG. But the company failed to create a wearable device that sufficiently shielded the recording system from ambient magnetic signals and canned the project last year.

By contrast, they now have a prototype product that uses fNIRS to monitor brain activity. This technology uses the same indicator of brain activity that functional magnetic resonance imaging relies on: localized increases in blood flow induced by a rise in neural activity. “Your neurons start to consume oxygen, and then your body tries to replenish it with fresh, oxygenated blood,” Field says.

To image cerebral blood flow (as opposed to, say, blood flowing near the surface of the wrist), Kernel uses time-domain fNIRS (TD-fNIRS): its system fires photons into the skull, then analyses only those that return within a time window consistent with their having entered and returned from the brain. The intensity of these late-arriving photons then indicates local blood flow.
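As a rough illustration of that time-gating principle, the sketch below counts only photons whose time of flight falls within a 'late' window assumed to correspond to light that has reached the cortex; the gate boundaries and names are hypothetical, not Kernel's specification.

    # Minimal sketch of the time-gating idea behind TD-fNIRS: keep only photons
    # whose flight time suggests they sampled brain tissue, and use their count
    # as a proxy for local absorption (and hence blood oxygenation changes).
    import numpy as np

    LATE_GATE_NS = (1.0, 3.0)   # assumed window (ns) for photons that reached cortex

    def late_photon_fraction(arrival_times_ns: np.ndarray) -> float:
        """Fraction of detected photons arriving within the 'late' time gate.

        arrival_times_ns: photon times of flight, in nanoseconds, from one
        laser-pulse cycle on a single source-detector pair.
        """
        lo, hi = LATE_GATE_NS
        late = np.count_nonzero((arrival_times_ns >= lo) & (arrival_times_ns <= hi))
        return late / max(len(arrival_times_ns), 1)

Tracking this late-photon fraction over time, at wavelengths absorbed differently by oxygenated and deoxygenated haemoglobin, gives the slowly evolving haemodynamic signal described below.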

The helmet-like Kernel Flow system4 contains 52 TD-fNIRS sensor modules, which cover the entire cranium to monitor the whole cortex. But Field says future systems could contain fewer sensors targeted at cortical regions of interest.

There is, however, a catch. “The haemodynamic signal takes typically on the order of a few seconds to evolve,” says Field. “It’s a lagging signal.”

Such a delay means that this device is unlikely to be useful for the timely control of external devices, but it does offer a new wearable and spatially precise way of monitoring individuals’ brain activity. “We’re really looking at brain states,” says Field.

As to where it might get its market uptake, Field is frank: “We don’t know yet”, he says, explaining that the company’s philosophy was to first build a new effective technology, then to give it to “explorers” to figure out what it is good for. Kernel, therefore, plans to get the Flow system into the hands of clinicians and researchers in 2023, to let them try out ways of using it — potentially to diagnose brain disorders and to monitor their treatment or to look at drug or meditation effects on brain activity. “How’s it going to help the neurotech market take off? I think that’s one of the big questions we have to answer for 2023,” says Field.

Implanted ECoG BCIs

Of the companies developing BCIs with implanted recording systems, two are exclusively pursuing ECoG devices: Synchron and Precision Neuroscience.

ECoG has been used clinically for decades, in particular to monitor epileptic activity. And indeed, one company — not always grouped with BCI developers — has already introduced a chronically implanted, closed-loop brain-stimulation system into the clinic for intractable epilepsy. The system developed by NeuroPace of Mountain View, California, uses ECoG to detect signs of imminent epileptiform activity. When it does, it automatically triggers a stimulating electrode, implanted near the source of this activity, to deliver high-frequency current pulses that decrease the likelihood of a full seizure5.

With ECoG also having been used in several landmark academic BCI proof-of-principle studies for controlling external devices, Synchron and Precision Neuroscience have developed ingenious ways of implanting novel recording interfaces.

Based in Brooklyn, New York, Synchron was founded in 2012 (as SmartStent, becoming Synchron in 2016) by a neurologist and a cardiologist, who combined their expertise to develop ‘stentrodes’. Inspired by stents — the mechanical devices that hold blood vessels open — a stentrode is a tubular metallic mesh of electrodes that, inserted into a cerebral blood vessel, will adhere to that vessel’s sides and record the population activity of neurons in the adjacent brain tissue.

Synchron takes advantage of the fact that the human motor cortex is home to a major blood vessel and targets this area so that a person’s imagined movements can be used to control a computer interface. Although ECoG does not offer the resolution of single-neuron recordings, CEO Tom Oxley (the neurologist founder) says, “when I move my hands right now, there’s very predictable patterns going on that can be recorded by ECoG.” The system is connected to a rechargeable battery implanted in the chest and transmits data wirelessly. The system’s analysis software then classifies signals as yes or no command outputs, which instruct a control system on an Apple iPhone.

“We’re trying to keep it as simple as possible,” Oxley says. “The basic functionality of the system is to enable the patient to control devices; to achieve tasks that improve their functional independence; so, things like e-mailing, texting, online tasks, digital healthcare. ”

Also relatively simple — compared with neurosurgery — is the insertion procedure, which is done as an outpatient procedure, with the device remaining in place indefinitely. So far, seven people living with severe paralysis have had Synchron’s stentrodes implanted with positive results6. The company, says Oxley, is in discussions with the FDA about a path to clinical approval. “It is still several years away,” he says, “but we are moving swiftly towards a pivotal trial.”

Precision Neuroscience — founded in 2021 by Rapoport and colleagues — takes a more conventional approach to acquiring ECoG recordings by placing electrodes on the cortical surface. Its innovation comes from both its thin-film electrode array — which contains 1,024 electrodes and can flex to conform to the curvature of the brain — and the minimally invasive surgery it uses to implant them.

“We make a tiny incision in the skin, and also a very small incision in the bone,” says Rapoport. “Then, we slide the electrode through that bone incision onto the brain surface. That’s a minimally invasive procedure that we developed here at Precision to minimize the footprint of the whole procedure on the patient.” Data are collected in a hub that is implanted subcutaneously on the cranium — and will be powered either by a small battery there or wired to a battery in the chest.

“The ultimate product will be a completely implanted system, no wires to the outside world,” says Rapoport. “And it will allow users to have direct brain communication with the digital world.”

A key feature of Precision’s electrode array is that it should cause no harm to the brain, meaning it can be moved to where it detects the most robust signals and also removed, which would enable upgrades to replace older electrodes if needed. Minimal surgery and damage-free recording are at the heart of Precision’s philosophy, explains Rapoport.

So far, the device has been tested in mini-pigs, and Precision are talking with the FDA about initial clinical trials this year. Such trials will involve, at first, using the system for short periods of time to monitor brain activity in people who need this, such as those with epilepsy undergoing medical examinations.

Single neurons

Today, three companies lead the way in developing BCIs that record from individual neurons. One is Paradromics; one is Musk’s Neuralink, which is based in Fremont, California; and the other is Blackrock Neurotech of Salt Lake City, Utah, a company that has been a mainstay of BCI research since its founding in 2008, when it picked up foundational implant technology that had been developed since the 1990s.

Blackrock’s so-called Utah array is a 10 × 10 grid of protruding platinum and silicon electrodes — 1 mm or 1.5 mm long and 80 µm in diameter at the top, tapering to a fine point — that penetrate the cortical surface. It was the first BCI to be implanted in a human in 2005. And since then, more than 30 people have received such implants — sometimes two or three — as part of ongoing clinical trials in which users have controlled various external devices from software to robotic arms7.

Typically, an individual array picks up the firing of dozens of neurons. Each electrode can also pass current to stimulate the brain very locally. This capability underpins another aspect of BCI technology — the possibility of ‘writing’ information into the brain. Early studies have shown, for example, that stimulating the somatosensory cortex of the brain can evoke a rudimentary, synthetic sense of touch.

Blackrock’s co-founder and chairman, Florian Solzbacher, says the company is hoping to soon commercialize their system. Initially, it is seeking to use its longstanding system in clinical trials, albeit this is an interface that connects to external devices via a wire that plugs into a transcranial port. Although for someone with whole-body paralysis such a connection may be a minor burden, the port is an infection risk, and Solzbacher says the company is working on a fully implanted wireless system.

“We want to get things out there as quickly as possible,” says Solzbacher, “to make the wired device available to that subset of patients to whom this would be useful and for whom the risk–benefit would be okay, so patients with ALS [amyotrophic lateral sclerosis] or with severe tetraplegia.”

Paradromics, founded in 2015 by Angle and colleagues, is also developing arrays with protruding electrodes. But Angle says they have done several things differently to Blackrock, such as using a 20 × 20 array to increase bandwidth and using different materials that Angle thinks will be more durable. The system is also designed to make it easier to implant multiple arrays. “The final thing that distinguishes our device from the Utah array,” Angle says, “is that we are using active electronics in the implantable intracortical module. That means that you can have as many electrodes as you want, and still only have a few wires coming out to connect to your transceiver and telemetry.”

The arrays have already been tested in sheep and Angle says that the company hopes to test what will be its final full system in these animals in 2023 as a last step to getting FDA clearance for a first-in-human trial.

Also now in discussions with the FDA regarding human trials — according to a December 2022 press conference — is Neuralink. Musk’s company is widely credited with accelerating the BCI sector. Its founding in 2016, says Solzbacher, “raised awareness. The financial markets started taking notice and so we started seeing companies being created.” Rapoport, a Neuralink co-founder who worked there for 15 months, says it “created a community of people who had a chance to work with each other in an amazing kind of capital unconstrained manner […], we explored a lot of different options and learned a lot about what was really possible.”

The technology that Neuralink ultimately chose is called ‘threads’: fine, flexible polymer electrodes designed to match the mechanical impedance of the brain, allowing them to be threaded through neural tissue and to remain there indefinitely picking up signals from hundreds, if not thousands, of neurons. The system has been trialled in monkeys so far, replicating previous BCI studies in which these animals control external software.

Making implanted BCIs work isn’t just about developing good electrodes though. Experts agree that many key academic advances have relied on the emergence of machine learning techniques that identify patterns in neural activity that correlate with what the user is thinking during training sessions. But questions remain about the reliability and scalability of these calibration systems in real-world settings where individual users may vary in unpredictable ways according to idiosyncrasies of their brain function or the nature of their brain injury.
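In outline, that calibration step amounts to fitting a decoder on neural features recorded while the user follows known cues, as in the simple Python sketch below; the linear classifier here is a generic stand-in, not any particular company's decoder, and the feature format is an assumption.

    # Minimal sketch of decoder calibration: map recorded neural features
    # (e.g. binned spike counts or band power) to the command the user was
    # instructed to imagine during a training session.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def calibrate_decoder(features: np.ndarray, cued_commands: np.ndarray):
        """features: (n_trials, n_channels) neural features recorded while the
        user followed on-screen cues; cued_commands: (n_trials,) labels such as
        'left', 'right' or 'select'. Returns a fitted decoder."""
        decoder = LogisticRegression(max_iter=1000)
        decoder.fit(features, cued_commands)
        return decoder

    def decode(decoder, new_features: np.ndarray) -> np.ndarray:
        """Map fresh neural features to predicted commands in closed loop."""
        return decoder.predict(new_features)

In practice such decoders are re-calibrated regularly, because signal quality and the user's strategy drift over days — which is exactly the reliability question raised above.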

Nevertheless, Angle doesn’t believe software is a major bottleneck in development. “There are going to be proprietary aspects to the user interface,” he says. “But at the same time, they all kind of do the same stuff — that’s not where the secret sauce is. The most important thing is to get high-quality signals.”

Indeed, when it comes to potential upcoming hurdles, Gilja highlights two other issues. First, there are the system components needed to transmit data from the brain. “Amplification, digitization doesn’t come for free,” he says, explaining that these components take up space, which is at a premium within the cranium, plus, he notes, “it takes energy, which equals heat.” These issues become greater the higher the amount of data being collected, and heat generation especially will need to be carefully assessed if it occurs close to the brain. “It’s a multifactorial challenge to build systems that allow for efficient data transport in and out of the body,” says Gilja.

Second, Gilja says, the innovations in the materials being used to record neural activity create certain unknowns — especially because, for many of the envisaged applications, implanted BCIs will need to function well for decades. “They’re using materials that we have less experience with. They’re more nascent technologies,” he says. “So there just isn’t as much collective experience and knowledge on how they will behave over much longer time periods. We won’t fully know about longevity till the years pass.”

What comes next?

The year 2023 promises to be significant for BCIs. Multiple non-invasive systems are likely to enter the hands of researchers and clinicians, and several key clinical trials of implanted systems are scheduled to begin. To succeed, BCIs must establish themselves as safe and truly helpful to people with serious medical conditions.

In the commercial sector, developers of devices aimed at mass uptake must convince people to part with their money and to embrace new ways of either knowing their own brains or interacting with their computers.

Nevertheless, many are bullish. Field says Kernel remains committed to their ambition of having a Flow headset in every home in the world by 2033. And Rapoport believes that in the implanted-device space, at least, a certain camaraderie is fuelling the sector. “We all have joint incentives to see the technology move forward,” he says, “and to see regulators and insurers and doctors and hospitals and the whole medical system and the public embrace the new generation of brain–computer interfaces.”

Timelines today are ambitious and almost certainly debatable. But one thing Vidal wrote of BCIs in 1973 is as true today as it was then: “the long-range implications of systems of that type can only be speculated upon at present.”

 

References

1. Vidal, J. J. Annu. Rev. Biophys. Bioeng. 2, 157–180 (1973).

2. Bundy, B. T. et al. Stroke 48, 1908–1915 (2017).

3. Rustamov, N., Humphries, J., Carter, A. & Leuthardt, E. C. Brain Commun. 4, fcac136 (2022).

4. Ban, H. Y. et al. J. Biomed. Opt. 27, 074710 (2022).

5. Bergey, G. K. et al. Neurology 84, 810–817 (2015).

6. Mitchell, P. et al. JAMA Neurol. https://doi.org/10.1001/jamaneurol.2022.4847 (2023).

7. Willett, F. R. et al. Nature 593, 249–254 (2021).