Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/175057
Title: A novel transformer for attention decoding using EEG
Authors: Lee, Joon Hei
Keywords: Computer and Information Science
Issue Date: 2024
Publisher: Nanyang Technological University
Source: Lee, J. H. (2024). A novel transformer for attention decoding using EEG. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/175057
Project: SCSE23-0162 
Abstract: Electroencephalography (EEG) attention classification plays a crucial role in brain-computer interface (BCI) applications. This paper introduces EEG-PatchFormer, a novel deep learning model that leverages transformers for EEG attention decoding. We posit that transformers’ strength in capturing long-range temporal dependencies, coupled with their recent success on spatial data, makes them well suited to processing EEG signals. We begin by outlining a pilot study investigating the impact of various patching strategies on the classification accuracy of a transformer-based network. This study revealed significant performance variations across patching methods, underscoring the importance of optimal patching for model efficacy. We then present the proposed EEG-PatchFormer architecture. Key modules include a temporal convolutional neural network (CNN), a pointwise convolutional layer, and separate patching modules that handle global spatial, local spatial, and temporal features. These feed into a transformer module, and the model culminates in a fully-connected classifier. Finally, we discuss EEG-PatchFormer’s performance across various evaluation experiments. Extensive evaluation on a publicly available cognitive attention dataset showed that EEG-PatchFormer surpasses existing state-of-the-art benchmarks in mean classification accuracy, area under the ROC curve (AUC), and macro-F1 score. Hyperparameter tuning and ablation studies were carried out to further optimise the model and to quantify the contribution of individual components. Overall, this project establishes EEG-PatchFormer as a state-of-the-art model for EEG attention decoding, with promising applications for BCI.
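As a rough illustration of the pipeline the abstract describes (temporal CNN, pointwise convolution, patching, transformer, classifier), a minimal PyTorch sketch follows. All class names, layer sizes, and patch lengths here are illustrative assumptions, and only a single temporal patching branch is shown; this is not the report's implementation.

import torch
import torch.nn as nn


class EEGPatchFormerSketch(nn.Module):
    """Hypothetical sketch of the pipeline described in the abstract:
    temporal CNN -> pointwise convolution -> patching -> transformer ->
    fully-connected classifier. All sizes are assumptions, not the report's."""

    def __init__(self, n_channels=32, d_model=64, patch_time=16, n_classes=2):
        super().__init__()
        # Temporal CNN: 1-D convolution along the time axis of each channel.
        self.temporal_cnn = nn.Conv2d(1, d_model, kernel_size=(1, 25), padding=(0, 12))
        # Pointwise (1x1) convolution to mix feature maps cheaply.
        self.pointwise = nn.Conv2d(d_model, d_model, kernel_size=1)
        # Temporal patching: each fixed-length time window over all channels
        # becomes one transformer token (the report also describes global and
        # local spatial patching branches, omitted here for brevity).
        self.patch_time = patch_time
        self.proj = nn.Linear(d_model * n_channels * patch_time, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, channels, samples) raw EEG segments
        x = self.pointwise(self.temporal_cnn(x.unsqueeze(1)))  # (B, D, C, T)
        b, d, c, t = x.shape
        n_patches = t // self.patch_time
        x = x[..., : n_patches * self.patch_time]              # trim remainder
        x = x.reshape(b, d * c, n_patches, self.patch_time)    # split time axis
        x = x.permute(0, 2, 1, 3).reshape(b, n_patches, -1)    # one row per patch
        z = self.transformer(self.proj(x)).mean(dim=1)         # pool tokens
        return self.classifier(z)                              # class logits


# Usage: classify a batch of 8 segments (32 channels, 512 samples) into 2 classes.
logits = EEGPatchFormerSketch()(torch.randn(8, 32, 512))       # shape (8, 2)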
URI: https://hdl.handle.net/10356/175057
Schools: School of Computer Science and Engineering 
Fulltext Permission: embargo_restricted_20250502
Fulltext Availability: With Fulltext
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP Report_Lee Joon Hei_Amended.pdf
Size: 3.66 MB
Format: Adobe PDF
Description: Under embargo until May 02, 2025

Page view(s): 785 (updated on Mar 16, 2025)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.