US scientists decode inner thoughts with 74% accuracy using BCI

By Michael Thompson · Featured posts, Innovation, News · 15/08/2025 (updated 28/09/2025)

Stanford University scientists have achieved a major milestone in neuroscience by successfully decoding inner speech — the silent thoughts in a person’s head — with an accuracy rate of up to 74%. This breakthrough offers new hope for people with severe speech and motor impairments.

Breakthrough in decoding silent thoughts

“This is the first time we’ve managed to understand what brain activity looks like when you just think about speaking,” said Erin Kunz, lead author from Stanford University.

The new brain-computer interface (BCI) translates a person’s inner thoughts into words and can be activated only when they think a specific mental password. For people with severe impairments, this could make communication easier and more natural.

From movement to inner speech

Brain-computer interfaces are not entirely new. They have long enabled direct communication between the brain and external devices, helping people with disabilities control prosthetic limbs by decoding movement-related brain signals.

Earlier research showed BCIs could decode attempted speech in people with paralysis, interpreting brain activity linked to trying to speak. While faster than eye-tracking systems, decoding attempted speech can still be slow and physically demanding for people with limited muscle control.

This limitation inspired the Stanford team to explore decoding inner speech — the silent internal voice we all have — as it could be easier and faster.

“If you just have to think about speech instead of actually trying to speak, it’s potentially easier and faster for people,” explained Benyamin Meschede-Krasa, the paper’s co-first author.

How the experiment worked

The study involved four participants with severe paralysis caused by conditions like amyotrophic lateral sclerosis (ALS) or brainstem stroke.

Image: Stanford’s brain-computer interface decodes inner speech with 74% accuracy from a 125,000-word vocabulary, aiding communication.

Researchers implanted microelectrodes into the motor cortex, the brain region controlling speech. Participants were instructed to either attempt speaking or imagine words.

Both actions activated similar brain regions and produced comparable neural patterns, though inner speech signals were weaker. Still, the patterns were distinct enough for artificial intelligence to interpret imagined words.

AI models trained on this data could decode sentences from a vocabulary of up to 125,000 words with 74% accuracy. The system even detected unplanned thoughts, such as numbers when participants counted objects on a screen.
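To make the pipeline concrete, the sketch below shows, in simplified Python, how recorded neural features might be turned into phoneme probabilities and then matched against a vocabulary. It illustrates the general approach only: the toy phoneme set, the two-word vocabulary, the single linear "network," and the word-matching rule are assumptions chosen for readability, not the Stanford team's published model.

# Minimal sketch of a speech-BCI decoding pipeline (illustrative only; the
# features, phoneme set, and matching rule are assumptions, not the
# published Stanford system). Neural features from the motor cortex are
# mapped to phoneme probabilities, then constrained to words in a vocabulary.

import numpy as np

PHONEMES = ["h", "eh", "l", "ow", "w", "er", "d", "_"]   # "_" = silence; toy set
VOCAB = {"hello": ["h", "eh", "l", "ow"], "world": ["w", "er", "l", "d"]}

def phoneme_probabilities(neural_features, weights):
    """Toy stand-in for the trained decoder: one linear layer plus softmax
    turns each time bin of neural features into a phoneme distribution."""
    logits = neural_features @ weights                       # (time, phonemes)
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)

def decode_word(probs):
    """Pick the vocabulary word whose phoneme sequence best matches the
    greedy per-bin predictions (a crude stand-in for beam search with a
    language model over the full vocabulary)."""
    greedy = [PHONEMES[i] for i in probs.argmax(axis=1)]
    collapsed = []                                           # collapse repeats, drop silence
    for p in greedy:
        if p != "_" and (not collapsed or collapsed[-1] != p):
            collapsed.append(p)
    best_word, best_score = "", -1.0
    for word, target in VOCAB.items():
        overlap = sum(a == b for a, b in zip(collapsed, target)) / max(len(target), 1)
        if overlap > best_score:
            best_word, best_score = word, overlap
    return best_word

rng = np.random.default_rng(0)
features = rng.normal(size=(20, 64))        # 20 time bins x 64 electrode features
weights = rng.normal(size=(64, len(PHONEMES)))
print(decode_word(phoneme_probabilities(features, weights)))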

Mental password unlocks the system

Although attempted and inner speech produce similar patterns, they are distinct enough for BCIs to tell them apart. This allows the system to ignore inner speech unless intentionally activated.

To give users control, researchers developed a mental password feature. Individuals could unlock the inner-speech decoding function by thinking of a pre-chosen keyword.

In experiments, participants used the phrase “chitty chitty bang bang” to activate the system, which recognized the password with over 98% accuracy.
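The gating idea can be illustrated with a short sketch: decoded output is simply discarded until the imagined keyword is recognized above a confidence threshold. The class, threshold value, and inputs below are hypothetical placeholders used to show the control flow, not the published system.

# Illustrative sketch of the "mental password" gate (the class, threshold,
# and inputs are hypothetical placeholders, not the published system).
# Decoding output stays suppressed until the imagined keyword is recognized
# with high confidence, so ordinary inner speech is ignored by default.

from typing import Optional

PASSWORD = "chitty chitty bang bang"
UNLOCK_THRESHOLD = 0.98   # comparable to the >98% recognition reported

class InnerSpeechSession:
    def __init__(self):
        self.unlocked = False

    def process(self, decoded_text: str, confidence: float) -> Optional[str]:
        """decoded_text and confidence come from the imagined-speech decoder."""
        if not self.unlocked:
            if decoded_text == PASSWORD and confidence >= UNLOCK_THRESHOLD:
                self.unlocked = True          # keyword recognized: start decoding
            return None                       # everything before unlock is discarded
        return decoded_text                   # after unlock, inner speech is output

session = InnerSpeechSession()
print(session.process("what should I say", 0.90))         # None (ignored)
print(session.process("chitty chitty bang bang", 0.99))    # None (unlocks the system)
print(session.process("I would like some water", 0.85))    # passed through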

Future of communication restoration

While current technology can’t flawlessly decode spontaneous inner speech, scientists are optimistic. With better sensors and algorithms, BCIs may one day restore communication as fluent and natural as ordinary conversation.

The study was reported in the journal Cell.
