
BrainBERT: Self-supervised representation learning for intracranial recordings

About

We create a reusable Transformer, BrainBERT, for intracranial recordings, bringing modern representation-learning approaches to neuroscience. Much as in NLP and speech recognition, this Transformer enables classifying complex concepts, i.e., decoding neural data, with higher accuracy and much less data, by being pretrained in an unsupervised manner on a large corpus of unannotated neural recordings. Our approach generalizes to new subjects with electrodes in new positions and to unrelated tasks, showing that the representations robustly disentangle the neural signal. Just as in NLP, where one can study language by investigating what a language model learns, this approach opens the door to investigating the brain through what a model of the brain learns. As a first step along this path, we demonstrate a new analysis of the intrinsic dimensionality of the computations in different areas of the brain. To construct these representations, we combine a technique for producing super-resolution spectrograms of neural data with an approach designed for generating contextual representations of audio by masking. In the future, far more concepts will be decodable from neural recordings using representation learning, potentially unlocking the brain the way language models unlocked language.
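The masked-spectrogram pretraining idea above can be sketched as follows. This is a minimal illustration only: the function names, the masking fractions, and the zero-fill strategy are assumptions for the sketch, not the exact BrainBERT recipe (which masks time and frequency bands of a neural spectrogram and trains a Transformer to reconstruct the hidden content).

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_spectrogram(spec, time_frac=0.15, freq_frac=0.15, rng=rng):
    """Hide a contiguous time band and a contiguous frequency band.

    Returns the masked spectrogram and a boolean mask marking the
    cells the model would be asked to reconstruct. (Illustrative
    helper, not the BrainBERT implementation.)
    """
    spec = spec.copy()
    n_freq, n_time = spec.shape
    mask = np.zeros(spec.shape, dtype=bool)

    # Mask a contiguous span of time frames.
    t_len = max(1, int(time_frac * n_time))
    t0 = rng.integers(0, n_time - t_len + 1)
    mask[:, t0:t0 + t_len] = True

    # Mask a contiguous band of frequency bins.
    f_len = max(1, int(freq_frac * n_freq))
    f0 = rng.integers(0, n_freq - f_len + 1)
    mask[f0:f0 + f_len, :] = True

    spec[mask] = 0.0  # zero-fill the hidden cells
    return spec, mask

def masked_l1(pred, target, mask):
    """Self-supervised loss: reconstruction error on hidden cells only."""
    return np.abs((pred - target)[mask]).mean()

# Toy "spectrogram": 40 frequency bins x 200 time frames of noise.
spec = rng.standard_normal((40, 200))
masked, mask = mask_spectrogram(spec)

# A model would predict the hidden cells from the visible context;
# here the masked input itself serves as a trivial zero-prediction baseline.
loss = masked_l1(masked, spec, mask)
```

In the real setup the reconstruction target is a spectrogram of intracranial recordings and the predictor is a Transformer encoder; the key design point this sketch shows is that the loss is computed only over the masked cells, so the model must use surrounding context to fill them in.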

Christopher Wang, Vighnesh Subramaniam, Adam Uri Yaari, Gabriel Kreiman, Boris Katz, Ignacio Cases, Andrei Barbu • 2023

Related benchmarks

Task                        | Dataset                              | Metric        | Result | Rank
----------------------------|--------------------------------------|---------------|--------|-----
Speech Decoding             | sEEG                                 | Cross-Entropy | 4.619  | 16
Speech Decoding             | sEEG signals 61-word                 | Accuracy      | 7.5    | 16
iEEG neural decoding        | Neuroprobe binary-label 1s (overall) | AUROC         | 0.586  | 5
iEEG neural decoding        | MAYO 6s                              | AUROC         | 0.748  | 5
Neural signal decoding      | Neuroprobe iEEG                      | Overall AUROC | 0.586  | 5
Sleep Classification        | DREAMT                               | AUROC         | 0.958  | 4
Skin Temperature Regression | DREAMT                               | RMSE          | 0.734  | 4
