
Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content here is summarized using Google Gemini and operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

A foundation model with multi-variate parallel attention to generate neuronal activity

Created by
  • Haebom

Author

Francesco Carzaniga, Michael Hersche, Abu Sebastian, Kaspar Schindler, Abbas Rahimi

Outline

In this paper, we propose a novel self-attention mechanism, multivariate parallel attention (MVPA), to address the challenge of learning from multivariate time series with diverse channel configurations, especially in clinical domains such as intracranial electroencephalography (iEEG), where channel layouts vary greatly across patients. MVPA decouples content, temporal, and spatial attention, allowing it to model time series with varying channel counts and configurations flexibly, generalizably, and efficiently. Using MVPA, we build MVPFormer, a generative model of human electrophysiology trained to predict iEEG signal evolution across diverse patients. To support this work and facilitate future research, we release the SWEC iEEG dataset (~10,000 hours of recordings), the largest publicly available iEEG dataset to date. Leveraging MVPA, MVPFormer achieves strong cross-patient generalization, reaches expert-level performance in seizure detection, and outperforms state-of-the-art Transformer-based models on the SWEC, MAYO, and FNUSA datasets. We also validate MVPA on standard time series forecasting and classification tasks, where it matches or exceeds existing attention-based models. In conclusion, this paper establishes MVPA as a general-purpose attention mechanism for heterogeneous time series, and MVPFormer as the first open-source, open-weight, open-data iEEG foundation model with state-of-the-art clinical performance. The code is available at https://github.com/IBM/multi-variate-parallel-transformer, and the SWEC iEEG dataset is available at https://mb-neuro.medical-blocks.ch/public_access/databases/ieeg/swec_ieeg.
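The core idea of decoupling content, temporal, and spatial attention can be illustrated with a toy sketch. This is a minimal illustration, not the authors' implementation: the shared weight matrices, additive relative-time bias, and per-configuration channel bias are all assumptions made for the example, and the real MVPA design in the paper may differ in how the three attention components are combined.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def mvpa_sketch(x, Wq, Wk, Wv, time_bias, chan_bias):
    """Toy decoupled attention over a (channels, time, dim) signal.

    Content scores come from the data; a time bias (T x T) models
    temporal structure shared by all channels; a channel bias (C x C)
    models spatial structure. Because the weights do not depend on C,
    any channel count works if the biases match the configuration
    (an assumption of this sketch)."""
    C, T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv            # each (C, T, d)
    # Content + time attention: each channel attends over its time axis.
    content = q @ k.transpose(0, 2, 1) / np.sqrt(d)   # (C, T, T)
    out_time = softmax(content + time_bias[None], axis=-1) @ v
    # Spatial attention: at each time step, channels attend to each other.
    xt = x.transpose(1, 0, 2)                         # (T, C, d)
    spatial = (xt @ Wq) @ (xt @ Wk).transpose(0, 2, 1) / np.sqrt(d)
    out_chan = softmax(spatial + chan_bias[None], axis=-1) @ (xt @ Wv)
    return out_time + out_chan.transpose(1, 0, 2)     # (C, T, d)

rng = np.random.default_rng(0)
C, T, d = 4, 16, 8                                   # 4 iEEG channels
x = rng.standard_normal((C, T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
y = mvpa_sketch(x, Wq, Wk, Wv,
                rng.standard_normal((T, T)) * 0.1,
                rng.standard_normal((C, C)) * 0.1)
print(y.shape)  # (4, 16, 8): output shape follows the input layout
```

Because the projection weights are independent of the channel count, the same parameters can be reused for a patient with a different electrode configuration, which is the property that enables cross-patient generalization in the paper.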

Takeaways, Limitations

Takeaways:
Presents MVPA, an effective method for modeling multivariate time series with varying channel configurations.
Provides MVPFormer, a state-of-the-art open-source foundation model for iEEG analysis.
Enables future research through the release of SWEC, a large-scale open iEEG dataset.
Verifies the versatility of MVPA across diverse time series tasks.
Limitations:
The generalizability of MVPA requires further validation across diverse clinical settings.
Despite its scale, the SWEC dataset may over- or under-represent certain patient populations.
The computational cost and training time of MVPFormer need further analysis.