Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/180824
Title: Leveraging temporal dependency for cross-subject-MI BCIs by contrastive learning and self-attention
Authors: Sun, Hao
Ding, Yi
Bao, Jianzhu
Qin, Ke
Tong, Chengxuan
Jin, Jing
Guan, Cuntai
Keywords: Computer and Information Science
Issue Date: 2024
Source: Sun, H., Ding, Y., Bao, J., Qin, K., Tong, C., Jin, J. & Guan, C. (2024). Leveraging temporal dependency for cross-subject-MI BCIs by contrastive learning and self-attention. Neural Networks, 178, 106470. https://dx.doi.org/10.1016/j.neunet.2024.106470
Project: A20G8b0102 
Journal: Neural Networks
Abstract: Brain-computer interfaces (BCIs) built on the motor imagery (MI) paradigm have found extensive utilization in motor rehabilitation and the control of assistive applications. However, traditional MI-BCI systems often exhibit suboptimal classification performance and require significant time for new users to collect subject-specific training data. This limitation diminishes the user-friendliness of BCIs and presents significant challenges in developing effective subject-independent models. In response to these challenges, we propose a novel subject-independent framework that learns temporal dependency for motor imagery BCIs by Contrastive Learning and Self-attention (CLS). In the CLS model, we incorporate a self-attention mechanism and supervised contrastive learning into a deep neural network to extract important information from electroencephalography (EEG) signals as features. We evaluate the CLS model on two large public datasets encompassing numerous subjects under subject-independent experimental conditions. The results demonstrate that CLS outperforms six baseline algorithms, achieving mean classification accuracy improvements of 1.3% and 4.71% over the best baseline algorithm on the Giga and OpenBMI datasets, respectively. Our findings demonstrate that CLS can effectively learn invariant discriminative features from training data obtained from non-target subjects, thus showcasing its potential for building models for new users without the need for calibration.
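The abstract names supervised contrastive learning as one of the two core ingredients of CLS. The paper's full text is not available here, so as an illustrative sketch only, the snippet below implements the standard supervised contrastive loss (in the sense of Khosla et al., 2020) over L2-normalized feature vectors in NumPy; the function name, temperature value, and toy features are assumptions, not details taken from the CLS model.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.07):
    """Supervised contrastive loss over a batch of feature vectors.

    For each anchor, positives are all other samples sharing its label;
    the loss is the mean negative log-probability of the positives under
    a temperature-scaled softmax over cosine similarities.
    """
    # L2-normalize so dot products are cosine similarities
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T / temperature

    n = len(labels)
    logits_mask = 1.0 - np.eye(n)          # exclude self-similarity
    sim = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    exp_sim = np.exp(sim) * logits_mask
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))

    labels = np.asarray(labels)
    pos_mask = (labels[:, None] == labels[None, :]).astype(float) * logits_mask
    pos_count = pos_mask.sum(axis=1)
    valid = pos_count > 0                   # anchors with at least one positive
    mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1)[valid] / pos_count[valid]
    return -mean_log_prob_pos.mean()
```

Intuitively, minimizing this loss pulls same-class (here, same MI-class) feature vectors together and pushes different-class vectors apart, which is what allows features learned from non-target subjects to transfer to new users: a batch whose classes are well separated in feature space yields a lower loss than a mixed-up one.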
URI: https://hdl.handle.net/10356/180824
ISSN: 0893-6080
DOI: 10.1016/j.neunet.2024.106470
Schools: School of Computer Science and Engineering 
Rights: © 2024 Published by Elsevier Ltd. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Journal Articles
