Decoding mental attention from EEG using deep neural networks

Bibliographic Details
Main Author: Phuah, Jethro An Ping
Other Authors: Guan Cuntai
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2023
Subjects:
Online Access: https://hdl.handle.net/10356/165877
Institution: Nanyang Technological University
Description
Summary: Decoding mental attention from the electroencephalogram (EEG) via deep learning has gained popularity among Brain-Computer Interface (BCI) researchers in recent years. Many hope to build a model reliable and accurate enough to be commercialized in the medical field and thereby help many people in their struggle against Attention Deficit Hyperactivity Disorder (ADHD). To gain deeper insight into this field, this report scrutinized and explored various deep learning models before focusing on the Deep Convolutional Neural Network (DeepConvNet), which currently boasts an accuracy of 77.9%. The report examined the factors behind its superior performance compared to peer architectures such as EEGNet and TSception. This was done through a series of experiments and fine-tuning to find the optimal hyperparameters for the model and to provide a detailed analysis for future BCI researchers, so that future work can be more streamlined and focused. Our experiments indicate that the depth, width, and activation function of the model (in decreasing order of importance) are the factors with the largest impact on DeepConvNet's performance, with the highest absolute change in classification accuracy of 7.4753% against the default parameters. To support potential online use of the deep learning model, I further explored pruning techniques to compress the model.
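
Illustrative sketch: the summary refers to tuning DeepConvNet's depth, width, and activation function, and to pruning the model for compression. The Python/PyTorch sketch below is only a hedged illustration of that idea; the layer sizes, kernel shapes, pruning amount, and the helper name build_deepconvnet are assumptions for demonstration, not the thesis's actual configuration or results.

    # Sketch of a DeepConvNet-style classifier whose depth, width, and
    # activation function are exposed as hyperparameters, plus simple
    # magnitude pruning for compression. Shapes and names are assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.utils.prune as prune


    def build_deepconvnet(n_channels=22, n_classes=4, n_blocks=4,
                          base_width=25, activation=nn.ELU):
        """DeepConvNet-like stack: temporal conv -> spatial conv -> conv blocks."""
        layers = [
            nn.Conv2d(1, base_width, kernel_size=(1, 10)),                    # temporal filter
            nn.Conv2d(base_width, base_width, kernel_size=(n_channels, 1)),   # spatial filter
            nn.BatchNorm2d(base_width),
            activation(),
            nn.MaxPool2d(kernel_size=(1, 3)),
        ]
        width = base_width
        for _ in range(n_blocks - 1):                     # depth hyperparameter
            layers += [
                nn.Dropout(0.5),
                nn.Conv2d(width, width * 2, kernel_size=(1, 10)),  # width doubles per block
                nn.BatchNorm2d(width * 2),
                activation(),                             # activation hyperparameter
                nn.MaxPool2d(kernel_size=(1, 3)),
            ]
            width *= 2
        layers += [nn.Flatten(), nn.LazyLinear(n_classes)]
        return nn.Sequential(*layers)


    # Vary the hyperparameters studied in the report (depth, width, activation).
    model = build_deepconvnet(n_blocks=3, base_width=25, activation=nn.ReLU)

    # Unstructured L1 magnitude pruning of each conv layer's weights (30%),
    # one simple way to compress the model for potential online use.
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            prune.l1_unstructured(module, name="weight", amount=0.3)
            prune.remove(module, "weight")  # make the pruning permanent

    # Forward pass on a dummy EEG batch: (batch, 1, channels, time samples)
    x = torch.randn(8, 1, 22, 1125)
    print(model(x).shape)  # -> torch.Size([8, 4])

In a sketch like this, a hyperparameter search would simply call build_deepconvnet with different n_blocks, base_width, and activation values and compare classification accuracy on held-out EEG data.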