A novel transformer for attention decoding using EEG

Bibliographic Details
Main Author: Lee, Joon Hei
Other Authors: Guan, Cuntai
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2024
Subjects: EEG
Online Access: https://hdl.handle.net/10356/175057
Institution: Nanyang Technological University
Description
Summary: Electroencephalography (EEG) attention classification plays a crucial role in brain-computer interface (BCI) applications. This paper introduces EEG-PatchFormer, a novel deep learning model that leverages transformers to achieve superior EEG attention decoding. We posit that transformers’ strength in capturing long-range temporal dependencies, coupled with their recent success on spatial data, makes them ideally suited to processing EEG signals. We begin by outlining a pilot study investigating the impact of various patching strategies on the classification accuracy of a transformer-based network. This study revealed significant performance variations across patching methods, underscoring the importance of optimal patching for model efficacy. We then present the proposed EEG-PatchFormer architecture. Key modules include a temporal convolutional neural network (CNN), a pointwise convolutional layer, and separate patching modules that handle global spatial, local spatial, and temporal features. These are followed by a transformer module, and the model culminates in a fully connected classifier. Finally, EEG-PatchFormer’s performance across various evaluation experiments is discussed. Extensive evaluation on a publicly available cognitive attention dataset demonstrated that EEG-PatchFormer surpasses existing state-of-the-art benchmarks in mean classification accuracy, area under the ROC curve (AUC), and macro-F1 score. Hyperparameter tuning and ablation studies were carried out to further optimise the model and to understand the contribution of its individual components. Overall, this project establishes EEG-PatchFormer as a state-of-the-art model for EEG attention decoding, with promising applications in BCI.
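The abstract names the model's building blocks (temporal CNN, pointwise convolution, patching modules, transformer, fully connected classifier) without giving code, so the following is a minimal PyTorch sketch of such a pipeline, not the authors' implementation. Every size below (filter counts, kernel lengths, patch length, depth) is an illustrative assumption, and the paper's separate global and local spatial patching branches are collapsed into a single temporal patching path here.

```python
# Minimal sketch of an EEG-PatchFormer-style pipeline. All hyperparameters
# are illustrative assumptions, not the project's reported configuration.
import torch
import torch.nn as nn


class EEGPatchFormerSketch(nn.Module):
    def __init__(self, n_channels=32, n_samples=512, n_classes=2,
                 temporal_filters=16, d_model=64, patch_len=32,
                 n_heads=4, n_layers=2):
        super().__init__()
        assert n_samples % patch_len == 0
        # Temporal CNN: 1-D convolution along the time axis of each channel.
        self.temporal_cnn = nn.Conv2d(1, temporal_filters,
                                      kernel_size=(1, 15), padding=(0, 7))
        # Pointwise convolution: mixes information across all EEG channels.
        self.pointwise = nn.Conv2d(temporal_filters, d_model,
                                   kernel_size=(n_channels, 1))
        # Temporal patching: cut the feature sequence into fixed-length
        # patches and embed each patch as one transformer token.
        self.patch_len = patch_len
        n_patches = n_samples // patch_len
        self.patch_embed = nn.Linear(d_model * patch_len, d_model)
        self.pos_embed = nn.Parameter(torch.zeros(1, n_patches, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(encoder_layer, n_layers)
        # Fully connected classifier over the mean-pooled token sequence.
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):                      # x: (batch, channels, time)
        x = self.temporal_cnn(x.unsqueeze(1))  # (batch, F, channels, time)
        x = self.pointwise(x).squeeze(2)       # (batch, d_model, time)
        b, d, t = x.shape
        # Fold the time axis into patches of length patch_len.
        x = x.reshape(b, d, t // self.patch_len, self.patch_len)
        x = x.permute(0, 2, 1, 3).reshape(b, -1, d * self.patch_len)
        tokens = self.patch_embed(x) + self.pos_embed
        tokens = self.transformer(tokens)      # (batch, n_patches, d_model)
        return self.classifier(tokens.mean(dim=1))


model = EEGPatchFormerSketch()
logits = model(torch.randn(8, 32, 512))  # 8 trials, 32 channels, 512 samples
print(logits.shape)                      # torch.Size([8, 2])
```

Under this framing, the patching-strategy pilot study described in the abstract corresponds to varying how the feature map is cut into tokens (for example, the patch length, or whether patches span time, channels, or both) and measuring the effect on classification accuracy.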