Breakthrough in Brain-Computer Interfaces: Implanted Electrodes Linked to a Computer Let AI Help Patients 'Speak'

A recent study reveals significant advancements in brain-computer interface (BCI) technology, driven by the application of artificial intelligence. Doctors at the University of California, Davis, implanted electrodes in the outer layer of the brain of ALS patient Casey Harrell, successfully decoding his attempts at verbal communication. These results surpassed expectations, setting a new benchmark for implanted language decoders and showcasing their immense potential for patients with speech impairments.

Amyotrophic lateral sclerosis (ALS) is a progressive neurodegenerative disease that affects the nerve cells controlling voluntary muscle movement, leading to gradual muscle atrophy and ultimately hindering the patient's ability to walk, speak, and swallow. As the disease progresses, patients typically experience a decline in their speech capabilities.

In Harrell's treatment, the medical team implanted four electrode arrays, twice the number used in previous studies. Each array contained 64 probes capable of capturing the neural signals generated when Harrell attempted to speak. Just three weeks after surgery, the research team connected the device to a computer, and after a brief calibration phase it was decoding his intended speech with an impressive 99.6% accuracy.

Researchers highlighted that the device bypasses the effects of Harrell's condition by reading signals directly from the region of the motor cortex responsible for generating speech commands. Neuroscientist Sergey Stavisky stated, “The key lies in the precise arrangement of more electrodes directly targeting the brain regions most involved in language expression.”

By the second day of trials, the device achieved 90% accuracy on a vocabulary of 125,000 words, allowing it, for the first time, to speak sentences Harrell composed in a synthesized version of his own voice. Researchers used recordings and podcast interviews made before his illness to build an AI simulation of his familiar voice. With continued training, the device's performance improved further, enabling Harrell to speak nearly 6,000 distinct words over eight months while maintaining a 97.5% accuracy rate.

This achievement marks a significant improvement over many mobile speech-transcription apps, which typically operate at around 75% accuracy. Unlike Elon Musk's Neuralink device, this implant allows a patient to express far more complex language and hold everyday conversations with family and friends.

The research also owes much of its success to the same kind of language-modeling AI behind tools like ChatGPT. Moment by moment, the implant gathers signals of neuronal activity and converts them into speech units, which are then assembled into words and sentences, effectively articulating what Harrell intended to express.
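To make that decoding step concrete, here is a minimal, hypothetical Python sketch of such a pipeline: per-timestep phoneme probabilities (standing in for the output of a neural decoder) are greedily decoded and matched to words in a tiny lexicon. The phoneme set, lexicon, and probability values are invented for illustration; the research team's actual system is far larger and also uses a language model to assemble full sentences.

```python
# Illustrative sketch of a phoneme-based speech-decoding pipeline.
# Hypothetical and simplified; not the UC Davis team's actual code.

import numpy as np

PHONEMES = ["_", "HH", "EH", "L", "OW"]          # "_" marks silence/blank
LEXICON = {("HH", "EH", "L", "OW"): "hello"}     # toy phoneme-to-word lexicon


def decode_phonemes(prob_matrix: np.ndarray) -> list[str]:
    """Greedy decoding: take the most likely phoneme at each time step,
    then collapse repeats and drop blanks (CTC-style)."""
    best = [PHONEMES[i] for i in prob_matrix.argmax(axis=1)]
    collapsed = []
    for p in best:
        if p != "_" and (not collapsed or collapsed[-1] != p):
            collapsed.append(p)
    return collapsed


def phonemes_to_words(phones: list[str]) -> str:
    """Look up the phoneme sequence in the lexicon; a real system would
    rescore many candidate word sequences with a language model."""
    return LEXICON.get(tuple(phones), "<unknown>")


if __name__ == "__main__":
    # Fake per-timestep phoneme probabilities (6 time steps x 5 classes),
    # standing in for features derived from the electrode arrays.
    probs = np.array([
        [0.1, 0.7, 0.1, 0.05, 0.05],   # HH
        [0.1, 0.6, 0.2, 0.05, 0.05],   # HH (repeat, collapsed away)
        [0.1, 0.1, 0.6, 0.1, 0.1],     # EH
        [0.1, 0.1, 0.1, 0.6, 0.1],     # L
        [0.7, 0.1, 0.1, 0.05, 0.05],   # blank
        [0.1, 0.05, 0.05, 0.1, 0.7],   # OW
    ])
    phones = decode_phonemes(probs)
    print(phones, "->", phonemes_to_words(phones))  # ['HH','EH','L','OW'] -> hello
```

Real decoders operate on hundreds of electrode channels at millisecond resolution and weigh many competing hypotheses at once, but the basic flow of neural activity into phonemes, words, and finally sentences follows the same shape.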

However, it remains uncertain whether this technology will be equally effective for patients with more severe paralysis. Although Harrell has experienced significant declines in speech ability, he has not completely lost it. Additionally, the high costs associated with this technology may pose a barrier for most patients.
