AI systems now track intentions forming in the brain
A major breakthrough comes from clinical trials in which paralyzed volunteers use implanted devices to type, control robotic limbs and produce synthetic speech. One participant, Nancy Smith, regained the ability to play simple melodies through an implant that analyzed her brain activity.
As she imagined pressing piano keys, the system detected her intention before she consciously acted, making the music feel as if it played itself. Researchers say this occurred because the implant captured preconscious planning signals in her posterior parietal cortex, a region involved in movement planning, attention and reasoning.
Teams working in this area report that these signals are rich and mixed, carrying information not only about movement but also about decision making, internal dialogue and moment-to-moment intentions. Several groups have already shown that they can decode fragments of internal speech and track how volunteers weigh card choices during a game of blackjack.
The new frontier combines implants with AI. Developers at companies such as Synchron and academic teams at Caltech have demonstrated that AI models trained on extensive neural recordings can identify subtle patterns previously dismissed as noise.
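Neither Synchron nor the Caltech teams have published their decoding pipelines, so the sketch below is only a generic illustration of the underlying idea: a model is trained on labelled neural recordings and then asked to classify new activity. The 96-channel firing-rate features, the left/right movement labels and the use of scikit-learn's logistic regression are assumptions made for this example, and the data are simulated.

```python
# Illustrative sketch only: the real decoding pipelines are not public, so this
# uses simulated data and a generic scikit-learn classifier to show the overall
# shape of "train a decoder on labelled neural recordings".
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Simulated dataset: 600 trials, each a 96-channel firing-rate vector,
# labelled with the movement the participant intended (0 = left, 1 = right).
n_trials, n_channels = 600, 96
labels = rng.integers(0, 2, size=n_trials)
# A small label-dependent offset stands in for the subtle patterns a real
# decoder would have to pull out of much noisier recordings.
features = rng.normal(size=(n_trials, n_channels)) + 0.4 * labels[:, None]

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0
)

decoder = LogisticRegression(max_iter=1000)
decoder.fit(X_train, y_train)

accuracy = accuracy_score(y_test, decoder.predict(X_test))
print(f"Decoding accuracy on held-out trials: {accuracy:.2f}")
```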
In unpublished work, Synchron researchers found that their system could detect a user’s error moments before the user became aware of it, offering a preview of how BCIs might intervene in real time. This creates a practical dilemma: devices could automatically correct mistakes and speed up performance, but doing so would mean acting on behalf of users without their explicit awareness or consent.
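The Synchron result is described only in broad terms, so the snippet below is a hypothetical illustration of the dilemma rather than any real product logic: it gates what a device does on a decoded error probability and on whether the user has opted in to silent corrections. The thresholds, the handle_decoded_error function and the opt-in flag are all invented for the example.

```python
# Hypothetical sketch of the consent dilemma described above; none of these
# names, thresholds or behaviours come from Synchron.
AUTO_CORRECT_THRESHOLD = 0.95   # act silently only when the decoder is very sure
SUGGEST_THRESHOLD = 0.70        # otherwise just flag the possible mistake

def handle_decoded_error(decoded_error_probability: float, user_opted_in: bool) -> str:
    """Return the action a BCI might take for one decoded 'error' event."""
    if decoded_error_probability >= AUTO_CORRECT_THRESHOLD and user_opted_in:
        return "auto-correct"          # acts before the user notices the mistake
    if decoded_error_probability >= SUGGEST_THRESHOLD:
        return "suggest-correction"    # surfaces the mistake, leaves the choice to the user
    return "do-nothing"

# Example: a confident error signal, but the user never opted in to silent fixes.
print(handle_decoded_error(0.97, user_opted_in=False))  # -> "suggest-correction"
```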
At the same time, consumer neurotech is advancing fast. Electroencephalography (EEG)-based headsets now use AI to improve signal quality and to give feedback on focus, stress or alertness. Although their recordings are far less precise than those of implanted devices, they can still reveal how people react to specific stimuli.
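Headset makers do not disclose how they score focus or alertness, so the sketch below uses a common textbook proxy instead: the ratio of beta-band to alpha-band power in the EEG spectrum, computed with SciPy's Welch estimator on a simulated single-channel signal. The sampling rate, band edges and the interpretation of the ratio are assumptions for illustration only.

```python
# Rough sketch of the kind of metric a consumer EEG headset might report,
# computed on simulated data rather than a real recording.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 256                                   # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)               # 10 seconds of data
rng = np.random.default_rng(1)
# Simulated EEG: 10 Hz alpha rhythm + 20 Hz beta rhythm + broadband noise.
eeg = (20 * np.sin(2 * np.pi * 10 * t)
       + 8 * np.sin(2 * np.pi * 20 * t)
       + 5 * rng.normal(size=t.size))

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)

def band_power(low: float, high: float) -> float:
    """Integrate the power spectral density between two frequencies."""
    mask = (freqs >= low) & (freqs < high)
    return trapezoid(psd[mask], freqs[mask])

alpha = band_power(8, 13)    # alpha band, often linked to relaxed wakefulness
beta = band_power(13, 30)    # beta band, often linked to active concentration

print(f"beta/alpha ratio (crude 'focus' index): {beta / alpha:.2f}")
```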
As clinical BCIs move closer to approval and consumer systems expand, experts say the central challenge is shifting. Earlier debates focused on keeping brain data private. Now, with AI decoding signals that reflect preconscious intentions, the focus is turning to how these systems might shape user behavior. Some fear that AI-assisted BCIs, especially those that suggest or draft communication, could begin to influence what users express and, eventually, how they think.
Earlier, Qazinform News Agency reported that Baidu unveiled next-gen AI chips.