BCI
7 articles about BCI in AI news
Survey Benchmarks Four Approaches to Synthetic Brain Signal Generation for BCI Data Scarcity
A comprehensive survey categorizes and benchmarks four methodological approaches to generating synthetic brain signals for BCIs, addressing data scarcity and privacy constraints. The authors provide an open-source codebase for comparing knowledge-based, feature-based, model-based, and translation-based generative algorithms.
Neuralink Patient Plays World of Warcraft Using Brain-Computer Interface, Demonstrating Complex Control
A Neuralink implant recipient has reportedly played World of Warcraft using only thought-based control. The demonstration highlights the BCI's ability to manage complex, multi-action gameplay.
Sabi Cap: 100k-Sensor EEG Hat Decodes Internal Speech at 30 WPM
Sabi released the Sabi Cap, a wearable EEG beanie with 70k-100k biosensors and a brain foundation model trained on 100k hours of neural data. It decodes internal speech to text at ~30 WPM and enables cursor control via intention.
Sabicap Develops Brain Wearable to Decode Imagined Speech into Text
Sabicap is developing a brain wearable with tens of thousands of sensors to decode imagined speech into text. The company, backed by Vinod Khosla, aims to create a system that works across users with minimal calibration for broad adoption.
Neuralink & ElevenLabs Demo AI Voice Restoration for Brain Implant User
Neuralink and voice-AI firm ElevenLabs demonstrated a system that generates speech for a Neuralink patient who lost their voice. The demo shows a brain-computer interface decoding intended speech into a synthetic voice in real time.
NeuroSkill: MIT's Breakthrough AI Agent Reads Your Mind Before You Ask
MIT researchers have developed NeuroSkill, an AI system that integrates brain-computer interfaces with foundation models to create proactive agents that respond to implicit human cognitive and emotional states, running fully offline on edge devices.
Brain-OF: The First Unified AI Model That Reads Multiple Brain Signals Simultaneously
Researchers have developed Brain-OF, described as the first omnifunctional foundation model that jointly processes fMRI, EEG, and MEG brain signals. This unified approach addresses previous single-modality limitations by integrating complementary spatiotemporal data through its architecture and pretraining techniques.