May 20, 2025 - News

Apple Adds Neural Input to iOS

iPhones powered by thought.
Image: a man wearing a headset views app icons (credit: Synchron)

Neural inputs go native.

The news: Apple introduced a BCI Human Interface Device (HID) protocol, enabling brain-computer interfaces (BCIs) to control iPhones, iPads, and Vision Pro headsets via neural signals; no touch or voice required.

Using Synchron’s Stentrode implant, users with paralysis will be able to navigate screens and select icons with their thoughts.

Next-gen UI. While the initial use case targets motor impairment, Apple’s move marks a paradigm shift toward brain signals as a new input layer, alongside touch, voice, and motion.

Like Bluetooth hearing aids or Apple Watch biometrics, BCI HID could eventually embed neurotech into the Apple ecosystem, with OS-level support unlocking seamless pairing, continuous syncing, and new app categories.
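To make the idea of "brain signals as an input layer" concrete, here is a minimal sketch of how an OS-level input layer might normalize decoded neural intents into the same kind of event stream used for touch or voice. Every name here (the `Intent` enum, `FocusState`, `dispatch`) is a hypothetical illustration, not Apple's actual BCI HID API, which has not been publicly documented.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Intent(Enum):
    """Hypothetical decoded neural intents, HID-style."""
    MOVE_NEXT = auto()   # shift focus to the next icon
    MOVE_PREV = auto()   # shift focus to the previous icon
    SELECT = auto()      # "click" the currently focused icon

@dataclass
class FocusState:
    icons: list = field(default_factory=list)
    index: int = 0       # which icon currently has focus

def dispatch(state: FocusState, intent: Intent):
    """Translate one decoded intent into a UI action.

    Movement intents update focus; SELECT returns the focused
    icon, standing in for launching that app.
    """
    if intent is Intent.MOVE_NEXT:
        state.index = (state.index + 1) % len(state.icons)
    elif intent is Intent.MOVE_PREV:
        state.index = (state.index - 1) % len(state.icons)
    elif intent is Intent.SELECT:
        return state.icons[state.index]
    return None

state = FocusState(icons=["Messages", "Safari", "Photos"])
dispatch(state, Intent.MOVE_NEXT)          # focus moves to Safari
launched = dispatch(state, Intent.SELECT)  # returns "Safari"
```

The point of a HID-style protocol is exactly this normalization: once neural intents arrive as standard input events, every app gets thought-based control for free, just as apps today don't distinguish a Bluetooth mouse from a trackpad.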

Think fast. As the $721B neuroscience market accelerates, cognitive data is going direct-to-consumer (DTC).

  • Muse uses EEG feedback to train focus and relaxation.
  • Myndspan’s clinics will apply MEG scans to monitor brain aging and injury.
  • Kernel’s helmet quantifies cognitive performance with noninvasive neuroimaging.

Elsewhere, Synchron competitor Neuralink is developing brain implants designed to restore function and augment the mind, while Atom Limbs uses AI to translate phantom limb sensations into prosthetic control.

Looking ahead: By hardwiring BCIs into iOS, Apple isn’t just improving accessibility; it’s hinting at a future of human-device interaction where thought becomes a primary input.

Fitt Insider
The future of health and wellness in one newsletter

Subscribe for insights on the wellness economy, gyms and studios, preventative healthcare, wearable tech, and more
