News
Architecture and Hardware

Brain Implants Give People Back What They Lost

Brain-computer interfaces are helping to restore movement, vision, and speech.


Ian Burkhart was a 19-year-old college student enjoying a day out with friends in 2010 when he dove into the water off North Carolina’s Outer Banks, hit bottom, and broke his neck. He wound up paralyzed below the elbows, unable to walk or to control his wrists or fingers. The accident did not end his story, though, because just four years later he became the first person to undergo a procedure aimed at restoring movement to his hands.

Researchers at The Ohio State University opened his skull and implanted an array of 96 electrodes into his brain. The electrodes recorded the neural activity that occurred when Burkhart imagined moving his hand, and sent that information along wires to a computer outside his head. A machine learning algorithm decoded the neuronal activity and passed on the information to a device on his forearm, which used electrical signals to stimulate muscle movement, essentially bypassing the broken lines of communication in his spine. With less than 10 hours of training, “I was able to start getting good control over the arm and feeling like it was me that was operating in it versus something foreign,” he said.
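The record-decode-stimulate loop described above can be sketched in a few lines of Python. Everything here is a hypothetical stand-in: the function names, weights, and electrode groupings are invented for illustration, and a real system uses trained machine learning models and far richer signal processing.

```python
from typing import List

NUM_CHANNELS = 96  # electrodes in the implanted array

def decode_intent(firing_rates: List[float], weights: List[float]) -> str:
    """Decode an intended movement from per-channel firing rates.
    A real decoder is a trained ML model; a weighted sum with learned
    weights stands in for it here."""
    score = sum(r * w for r, w in zip(firing_rates, weights))
    return "close_hand" if score > 0 else "open_hand"

def stimulation_pattern(intent: str) -> List[int]:
    """Map a decoded intent to on/off commands for the forearm
    stimulation electrodes (purely illustrative groupings)."""
    patterns = {
        "close_hand": [1, 1, 0, 0],  # flexor electrodes on
        "open_hand":  [0, 0, 1, 1],  # extensor electrodes on
    }
    return patterns[intent]

# One pass through the loop: record -> decode -> stimulate
rates = [0.5] * NUM_CHANNELS    # pretend firing rates from the array
weights = [0.1] * NUM_CHANNELS  # pretend learned decoder weights
intent = decode_intent(rates, weights)
print(intent, stimulation_pattern(intent))
```

The point of the sketch is the division of labor: the decoder turns raw neural activity into a discrete intention, and a separate mapping turns that intention into stimulation commands, which is how the system bypasses the damaged spinal pathway.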

Although the trial originally was scheduled to last 18 months, Burkhart continued to work with the researchers for seven years. At that point, suffering from a slight skin infection around the protruding wires and with the research at a natural stopping point, he chose to have the electrodes removed. Under the right circumstances, though, he would happily get a new implant, he said.

Ian Burkhart, who became quadriplegic in a 2010 diving accident, uses a brain-computer interface to control muscle stimulation that restores movement to his fingers and wrists.

While he was the first person to have electrodes implanted in his brain to restore function to his own body, others have had similar brain-computer interfaces (BCIs) designed to let them operate a computer or control a prosthetic limb. Their numbers are slowly growing, as academic researchers continue to work on decoding signals from the brain, and private companies expand their efforts to help people with paralysis, amyotrophic lateral sclerosis (ALS) or other conditions that rob them of the ability to interact with the outside world.

In January 2024, for instance, Neuralink, a company co-founded by Elon Musk, implanted its BCI into its first patient, 30-year-old quadriplegic Noland Arbaugh. The Neuralink device consists of 1,024 electrodes distributed across 64 threads, each just a few micrometers wide. The end of the device is sealed inside a biocompatible enclosure and powered by a battery that can be recharged by induction from outside the skull, eliminating protruding wires. The threads, which penetrate three to four millimeters into the brain, are so fine that they must be inserted by a surgical robot.

Shortly after the surgery, however, the threads, which researchers had expected would be held in place by scar tissue, started coming loose from Arbaugh’s brain. Eventually 85% of the threads lost their connection. Even so, changes to the algorithms allowed Arbaugh to keep using the BCI to control a computer cursor and play video games.

To keep the threads from pulling loose, the company said it would insert them more deeply in subsequent patients, to depths of up to about 8 mm. Musk announced in early August that Neuralink had implanted its device in a second spinal cord injury patient, and that 400 of the electrodes in the device were working. The threads were not pulling loose this time, the company said in an update. Musk said he hoped to implant the device in eight more people in 2024.

Improving interfaces

Of course, all the signal processing and machine learning algorithms that interpret the brain’s activity need data to function, and the workhorse for collecting those signals has been the Utah array, manufactured by Blackrock Neurotech (https://blackrockneurotech.com/), which consists of 96 electrodes. Researchers are exploring alternatives, however. While many groups implant electrodes, some, such as Precision Neuroscience (https://precisionneuro.io/), place them on the surface of the brain. Another approach, taken by a company called Synchron (https://synchron.com/), is to insert so-called ‘stentrodes’ through blood vessels, which eliminates the need to cut through the skull. While less-invasive devices may trigger less inflammation in the brain, they do not record as much signal as those inserted into the tissue.

Blackrock has developed a new interface it calls Neuralace, a thin, flexible, mesh-like chip designed to conform to the uneven surface of the brain and to provide access to more than 10,000 recording channels. The number of neurons from which such devices can record is only going to grow, said Florian Solzbacher, a professor of electrical and computer engineering at the University of Utah and co-founder of Blackrock. “There will be an acceleration in terms of the information that can be extracted from the brain,” he said. That will not only increase scientific understanding of how the brain works, but also open up new possibilities for helping people, which, he says, is the whole point of brain implants. Such devices will still, however, record from only a small subset of the hundred billion neurons in the brain.

As the recording devices improve, researchers are testing what brain implants can do besides moving a cursor or a mechanical limb. Neuroscientist Sergey Stavisky and neurosurgeon David Brandman, who co-direct the Neuroprosthetics Lab at the University of California, Davis, are using BCIs to restore paralyzed people’s ability to speak. As part of the research consortium BrainGate (https://www.braingate.org/), they have implanted electrodes into the brain of a patient with ALS whose speech had been reduced to mostly unintelligible grunts.

The electrodes record neural activity from approximately 250 neurons as he attempts to speak, taking readings every 80 ms and learning how he is trying to move the muscles that control his lips, jaw, larynx, and diaphragm for each of 39 phonemes, the units of sound that combine to make words. A language model, like those behind ChatGPT, looks at the phonemes and estimates the most likely word. A second language model examines those words in the context of a phrase and decides whether they are what was intended. The user signals to the computer when it has produced the correct phrase.
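The two-stage decoding described above, combining decoder evidence with language-model context, can be illustrated with a toy example. The candidate words, probabilities, and tiny bigram table below are all invented for illustration; the actual system uses large neural language models over a full vocabulary.

```python
import math

# Hypothetical decoder output: likelihood of each candidate word
# given the decoded phoneme sequence.
acoustic = {"pain": 0.40, "plane": 0.45, "rain": 0.15}

# A tiny, made-up bigram language model: P(word | previous word).
bigram = {
    ("in", "pain"): 0.30,
    ("in", "plane"): 0.01,
    ("in", "rain"): 0.05,
}

def pick_word(prev: str, acoustic_scores: dict) -> str:
    """Combine decoder evidence with language-model context in log space,
    then pick the highest-scoring candidate."""
    def score(word: str) -> float:
        lm = bigram.get((prev, word), 1e-6)  # tiny floor for unseen pairs
        return math.log(acoustic_scores[word]) + math.log(lm)
    return max(acoustic_scores, key=score)

# The decoder alone slightly prefers "plane", but after "I'm in ..."
# the language-model context pulls the choice to "pain".
print(pick_word("in", acoustic))  # -> pain
```

This is the statistical structure Stavisky refers to: even when the neural evidence is ambiguous, the language model's knowledge of which words plausibly follow which makes up for inaccuracies in the brain measurement.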

With just a couple of hours of training, the patient was able to accurately produce sentences that were displayed on a computer screen, then read aloud by text-to-speech software designed to sound like his own voice. The researchers hope eventually to design a brain-to-voice version, where the intentions are instantaneously translated into sound, complete with natural inflections. That problem is harder than brain-to-text, Stavisky said, because the machine learning algorithms would have to decode the relatively noisy signal within a shorter time than with text, and because the language model would have less context to decide which words are correct. “If you’re decoding text, you can use that statistical structure to basically make up for some of the inaccuracies in brain measurement,” he explained, an advantage that is lost with the faster process of producing speech.

In and out

Other groups are working in the reverse direction, trying to send signals into the brain. Philip Troyk, a biomedical engineer at the Illinois Institute of Technology, is working on a visual prosthesis that takes information recorded by a camera mounted to eyeglasses and, bypassing damage to the eye, sends the data directly to a person’s brain. The person does not see an actual image, just a small number of bright dots or lines called phosphenes. In a video demonstrating the experience, a user asked to pick up a plate scans across a table until he sees a small grouping of phosphenes that might indicate the object, then uses his hand to confirm it is what he is seeking.

“At the stage of where the technology is, we’re trying to take someone who’s totally blind and make them legally blind, where legal blindness actually is some reasonable vision,” Troyk said. He implanted his first patient in 2022 with about 400 electrodes. While each electrode stimulates a group of neurons, that is a small subset of the 350 million involved in the visual cortex. The signal is also crude compared to what the eye produces, sending a jolt of current to several neurons simultaneously, without the temporal variations in biological vision.

Robert Gaunt, a biomedical engineer at the University of Pittsburgh, is working on bidirectional signals, where a patient controls a prosthetic hand with his thoughts while physical sensations from the hand provide feedback that, for instance, helps the user adjust their grip. Providing a sense of touch allowed a patient performing tasks, such as grasping and lifting an “object”—actually a picture on a computer screen—to cut the time it took to do those tasks in half. The feeling is often unlike natural touch, Gaunt said, sometimes registering as a tingling. He is now working to deliver more nuanced signals that allow the patient to feel more subtle touch, such as an object moving across the hand. Eventually, he says, artificial intelligence may allow researchers to create new stimulus signals that work even better.

While allowing paralyzed people to control computers is helpful in giving them independence, “What we want to work on is real, physical interactions with the world,” Gaunt said. “That’s why we think touch is so important.”

While some BCIs could take years to perfect, many of the people who could benefit from them do not have that long to wait. Brandman is hopeful that in five years or so, he will be able to offer ALS patients surgery to restore their speech. “There are patients in need today that want to be able to tell their loved ones that they love them, or that want to be able to tell the healthcare worker, ‘I’m in pain’,” he said.

Burkhart, who is president of the North American Spinal Cord Injury Consortium, says BCI technology has the potential to be life-changing for many people. “If we can get to the point where these devices are available to people in the general public, outside of clinical trials, you’re going to see how much it can really improve someone’s quality of life,” he said.

Further Reading

Barry, M.P., Sadeghi, R., Towle, V.L., Stipp, K., et al.
Preliminary visual function for the first human with the Intracortical Visual Prosthesis (ICVP), Investigative Ophthalmology & Visual Science, 2023

Fisher, L.E., Gaunt, R., and He, H.
Sensory restoration for improved motor control of prostheses, Current Opinion in Biomedical Engineering, 2023, https://doi.org/10.1016/j.cobme.2023.100498

Sharma, G., Friedenberg, D.A., Annetta, N., Glenn, B., et al.
Using an Artificial Neural Bypass to Restore Cortical Control of Rhythmic Movements in a Human with Quadriplegia, Scientific Reports, 2016, https://doi.org/10.1038/srep33807

Metzger, S.L., Littlejohn, K.T., Silva, A.B., Moses, D.A., et al.
A high-performance neuroprosthesis for speech decoding and avatar control, Nature, 2023, https://doi.org/10.1038/s41586-023-06443-4

Card, N.S., Wairagkar, M., Iacobacci, C., et al.
An Accurate and Rapidly Calibrating Speech Neuroprosthesis, New England Journal of Medicine, 2024, https://doi.org/10.1056/NEJMoa2314132

New Brain-Computer Interface Allows Man with ALS to ‘Speak’ Again
https://www.youtube.com/watch?v=thPhBDVSxz0&t=275s
