Stanford University scientists have achieved a major milestone in neuroscience by decoding inner speech — the silent thoughts in a person’s head — with up to 74% accuracy. This breakthrough offers new hope for people with severe speech and motor impairments.
“This is the first time we’ve managed to understand what brain activity looks like when you just think about speaking,” said Erin Kunz, lead author from Stanford University.
The new brain-computer interface (BCI) translates a person’s inner thoughts into words and can be activated only when the user thinks of a specific mental password. For people with severe impairments, this could make communication easier and more natural.
From movement to inner speech
Brain-computer interfaces are not entirely new. They have long enabled direct communication between the brain and external devices, helping people with disabilities control prosthetic limbs by decoding movement-related brain signals.
Earlier research showed BCIs could decode attempted speech in people with paralysis, interpreting brain activity linked to trying to speak. While faster than eye-tracking systems, decoding attempted speech can still be slow and physically demanding for people with limited muscle control.
This limitation inspired the Stanford team to explore decoding inner speech — the silent internal voice we all have — as it could be easier and faster.
“If you just have to think about speech instead of actually trying to speak, it’s potentially easier and faster for people,” explained Benyamin Meschede-Krasa, the paper’s co-first author.
How the experiment worked
The study involved four participants with severe paralysis caused by conditions like amyotrophic lateral sclerosis (ALS) or brainstem stroke.
Researchers implanted microelectrodes into the motor cortex, the brain region controlling speech. Participants were instructed to either attempt speaking or imagine words.
Both actions activated similar brain regions and produced comparable neural patterns, though inner speech signals were weaker. Still, the patterns were distinct enough for artificial intelligence to interpret imagined words.
AI models trained on this data could decode sentences from a vocabulary of up to 125,000 words with 74% accuracy. The system even picked up uninstructed inner speech, such as the numbers participants silently counted when tallying objects on a screen.
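The article does not describe the study's actual decoder architecture, but the core idea — mapping noisy neural activity patterns to the words that evoked them — can be illustrated with a toy sketch. Everything below is hypothetical: the electrode count, the word list, the simulated "firing patterns," and the nearest-centroid classifier (a deliberately simple stand-in for the study's AI models).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each imagined word evokes a characteristic activity
# pattern across 64 recorded electrodes. We simulate noisy trials and decode
# new trials by finding the closest learned pattern.
n_electrodes = 64
words = ["yes", "no", "water", "help"]
templates = {w: rng.normal(size=n_electrodes) for w in words}

def simulate_trial(word, noise=0.8):
    """Return a noisy neural feature vector for one imagined word."""
    return templates[word] + rng.normal(scale=noise, size=n_electrodes)

# Training: estimate each word's mean activity pattern from example trials.
centroids = {w: np.mean([simulate_trial(w) for _ in range(50)], axis=0)
             for w in words}

def decode(features):
    """Return the word whose learned centroid is closest to the observation."""
    return min(centroids, key=lambda w: np.linalg.norm(features - centroids[w]))

# Evaluation on fresh simulated trials.
n_test = 50
correct = sum(decode(simulate_trial(w)) == w for w in words for _ in range(n_test))
accuracy = correct / (len(words) * n_test)
print(f"decoding accuracy: {accuracy:.0%}")
```

In the real system the vocabulary is vastly larger and the signals far weaker, which is why the study's reported accuracy tops out around 74% rather than the near-perfect score this clean simulation produces.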
Mental password unlocks the system
Although attempted and inner speech produce similar patterns, they are distinct enough for BCIs to tell them apart. This allows the system to ignore inner speech unless intentionally activated.
To give users control, researchers developed a mental password feature. Individuals could unlock the inner-speech decoding function by thinking of a pre-chosen keyword.
In experiments, participants used the phrase “chitty chitty bang bang” to activate the system, which recognized the password with over 98% accuracy.
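One way to picture the password mechanism is as a gate in front of the decoder: inner speech is ignored until a dedicated detector recognizes the neural signature of the pre-chosen phrase with high confidence. The sketch below is purely illustrative — the template vector, similarity measure, and threshold are assumptions, not details from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

n_features = 64
# Hypothetical neural signature of the chosen password phrase.
password_pattern = rng.normal(size=n_features)

def password_score(features):
    """Cosine similarity between observed activity and the password template."""
    return float(features @ password_pattern /
                 (np.linalg.norm(features) * np.linalg.norm(password_pattern)))

# Threshold chosen so stray inner speech almost never unlocks the system.
THRESHOLD = 0.5

def try_unlock(features):
    """Return True only if the activity closely matches the password pattern."""
    return password_score(features) > THRESHOLD

# Unrelated inner speech should stay locked out...
stray_thought = rng.normal(size=n_features)
# ...while thinking the password (template plus a little noise) should unlock.
password_attempt = password_pattern + rng.normal(scale=0.3, size=n_features)

print(try_unlock(stray_thought), try_unlock(password_attempt))
```

The design choice matters for privacy: because the decoder runs only after an intentional unlock, the system does not transcribe whatever the user happens to be thinking.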
Future of communication restoration
While current technology can’t flawlessly decode spontaneous inner speech, scientists are optimistic. With better sensors and algorithms, BCIs may one day restore communication as fluent and natural as ordinary conversation.
The study was reported in the journal Cell.
