TSception: capturing temporal dynamics and spatial asymmetry from EEG for emotion recognition
The high temporal resolution and the asymmetric spatial activations are essential attributes of electroencephalogram (EEG) underlying emotional processes in the brain. To learn the temporal dynamics and spatial asymmetry of EEG towards accurate and generalized emotion recognition, we propose TSception, a multi-scale convolutional neural network that can classify emotions from EEG. TSception consists of dynamic temporal, asymmetric spatial, and high-level fusion layers, which learn discriminative representations in the time and channel dimensions simultaneously. The dynamic temporal layer consists of multi-scale 1D convolutional kernels whose lengths are related to the sampling rate of EEG, which learns the dynamic temporal and frequency representations of EEG. The asymmetric spatial layer takes advantage of the asymmetric EEG patterns for emotion, learning the discriminative global and hemisphere representations. The learned spatial representations are then fused by a high-level fusion layer. Using more generalized cross-validation settings, the proposed method is evaluated on two publicly available datasets, DEAP and MAHNOB-HCI. The performance of the proposed network is compared with prior reported methods such as SVM, KNN, FBFgMDM, FBTSC, unsupervised learning, DeepConvNet, ShallowConvNet, and EEGNet. TSception achieves higher classification accuracies and F1 scores than the other methods in most of the experiments. The code is available at: https://github.com/yi-ding-cs/TSception.
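
The abstract describes the architecture only at a high level. As a concrete illustration, below is a minimal PyTorch sketch of those ideas: parallel temporal convolutions whose kernel lengths are fixed fractions of the EEG sampling rate, a global and a hemisphere-wise spatial convolution, and a high-level fusion stage. It is an assumption-laden approximation, not the authors' implementation (that is available at https://github.com/yi-ding-cs/TSception); the channel count, filter counts, kernel-length ratios, and pooling sizes shown here are illustrative guesses.

```python
# Minimal sketch of a TSception-style network, assuming 28 EEG channels at
# 128 Hz and three temporal kernel scales. Not the official implementation.
import torch
import torch.nn as nn


class TSceptionSketch(nn.Module):
    def __init__(self, n_channels=28, sampling_rate=128, n_classes=2,
                 n_temporal_filters=9, n_spatial_filters=6,
                 t_ratios=(0.5, 0.25, 0.125)):
        super().__init__()
        # Dynamic temporal layer: parallel 1D convolutions over time, with
        # kernel lengths tied to the sampling rate (0.5 s, 0.25 s, 0.125 s here).
        self.temporal_branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(1, n_temporal_filters,
                          kernel_size=(1, int(r * sampling_rate)), stride=1),
                nn.LeakyReLU(),
                nn.AvgPool2d(kernel_size=(1, 8), stride=(1, 8)),
            )
            for r in t_ratios
        ])
        # Asymmetric spatial layer: one kernel spanning all channels (global)
        # and one spanning half of them (hemisphere), assuming the channel
        # axis is ordered so each half corresponds to one hemisphere.
        self.global_conv = nn.Sequential(
            nn.Conv2d(n_temporal_filters, n_spatial_filters,
                      kernel_size=(n_channels, 1)),
            nn.LeakyReLU(),
        )
        self.hemisphere_conv = nn.Sequential(
            nn.Conv2d(n_temporal_filters, n_spatial_filters,
                      kernel_size=(n_channels // 2, 1),
                      stride=(n_channels // 2, 1)),
            nn.LeakyReLU(),
        )
        # High-level fusion layer followed by a small classifier head.
        self.fusion = nn.Sequential(
            nn.Conv2d(n_spatial_filters, n_spatial_filters, kernel_size=(3, 1)),
            nn.LeakyReLU(),
            nn.AdaptiveAvgPool2d((1, 1)),
        )
        self.classifier = nn.Linear(n_spatial_filters, n_classes)

    def forward(self, x):
        # x: (batch, 1, n_channels, n_time_samples)
        t_out = torch.cat([branch(x) for branch in self.temporal_branches], dim=-1)
        s_out = torch.cat([self.global_conv(t_out),
                           self.hemisphere_conv(t_out)], dim=2)
        fused = self.fusion(s_out).flatten(1)
        return self.classifier(fused)


if __name__ == "__main__":
    model = TSceptionSketch()
    eeg = torch.randn(4, 1, 28, 512)   # 4 trials, 28 channels, 4 s at 128 Hz
    print(model(eeg).shape)            # torch.Size([4, 2])
```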
Main Authors: | Ding, Yi; Robinson, Neethu; Zhang, Su; Zeng, Qiuhao; Guan, Cuntai |
---|---|
Other Authors: | College of Computing and Data Science; School of Computer Science and Engineering |
Format: | Article |
Language: | English |
Published: | 2024 |
Subjects: | Computer and Information Science; Deep learning; Convolutional neural networks; Electroencephalography; Emotion recognition |
Online Access: | https://hdl.handle.net/10356/179071 |
Institution: | Nanyang Technological University |

id: | sg-ntu-dr.10356-179071 |
---|---|
record_format: | dspace |
Citation: | Ding, Y., Robinson, N., Zhang, S., Zeng, Q. & Guan, C. (2022). TSception: capturing temporal dynamics and spatial asymmetry from EEG for emotion recognition. IEEE Transactions on Affective Computing, 14(3), 2238-2250. https://dx.doi.org/10.1109/TAFFC.2022.3169001 |
ISSN: | 1949-3045 |
DOI: | 10.1109/TAFFC.2022.3169001 |
Funding: | Agency for Science, Technology and Research (A*STAR); this work was supported in part by the RIE2020 AME Programmatic Fund, Singapore, under Grant A20G8b0102. |
Rights: | © 2022 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 License. Published version (application/pdf). |
Collection: | DR-NTU (NTU Library), Nanyang Technological University, Singapore |