Gastronomic gaze: decoding EEG-based discriminative patterns in four basic tastes with visual cues

Does viewing food pictures evoke any taste-related response? The answer to this question has applications in fields such as neuromarketing. At the intersection of neurotechnology and gastronomy, my research investigates the use of multimodal neural, physiological, and physical sensors to decode and classify taste perceptions with or without visual stimuli. This study builds a multi-stage experiment design, with cognitive, smell, and taste profiling, to evaluate the effect of presenting gastronomic visual stimuli concurrently with different taste stimuli. Our experiment asks whether EEG combined with other sensors improves classification accuracy across five taste conditions: sweet, sour, salty, bitter, and neutral, in the presence of food or non-food images. One of our research hypotheses is "Does the congruency of the visual and taste stimuli increase decoding and classification performance among the four tastes and the control (neutral)?" To this end, I designed an experiment encompassing these conditions and collected multimodal data in a controlled lab setting, using standard taste-testing kits with nine healthy subjects. From the collected data, I analyze and evaluate both data-driven deep learning approaches and eXplainable Artificial Intelligence (XAI) with causal machine learning to better understand the intricate relationships and interplay between taste, cognition, and behavior. I believe this investigation will reveal promising results in EEG taste decoding, while XAI gives meaningful insights into discriminative EEG patterns among tastes.


Bibliographic Details
Main Author: Jaiswal, Arnav
Other Authors: Guan Cuntai
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2024
Subjects:
EEG
XAI
Online Access:https://hdl.handle.net/10356/175194
Institution: Nanyang Technological University
Language: English
id sg-ntu-dr.10356-175194
record_format dspace
spelling sg-ntu-dr.10356-1751942024-04-19T15:43:16Z Gastronomic gaze: decoding EEG-based discriminative patterns in four basic tastes with visual cues Jaiswal, Arnav Guan Cuntai School of Computer Science and Engineering Center for Brain-Computing Research CTGuan@ntu.edu.sg Computer and Information Science EEG Brain Taste Visual cue Congruent XAI Multisensory Does viewing food pictures evoke any taste-related response? The answer to this question has applications in fields such as neuromarketing. At the intersection of neurotechnology and gastronomy, my research investigates the use of multimodal neural, physiological, and physical sensors to decode and classify taste perceptions with or without visual stimuli. This study builds a multi-stage experiment design, with cognitive, smell, and taste profiling, to evaluate the effect of presenting gastronomic visual stimuli concurrently with different taste stimuli. Our experiment asks whether EEG combined with other sensors improves classification accuracy across five taste conditions: sweet, sour, salty, bitter, and neutral, in the presence of food or non-food images. One of our research hypotheses is "Does the congruency of the visual and taste stimuli increase decoding and classification performance among the four tastes and the control (neutral)?" To this end, I designed an experiment encompassing these conditions and collected multimodal data in a controlled lab setting, using standard taste-testing kits with nine healthy subjects. From the collected data, I analyze and evaluate both data-driven deep learning approaches and eXplainable Artificial Intelligence (XAI) with causal machine learning to better understand the intricate relationships and interplay between taste, cognition, and behavior. I believe this investigation will reveal promising results in EEG taste decoding, while XAI gives meaningful insights into discriminative EEG patterns among tastes.
Bachelor's degree 2024-04-19T12:28:58Z 2024-04-19T12:28:58Z 2024 Final Year Project (FYP) Jaiswal, A. (2024). Gastronomic gaze: decoding EEG-based discriminative patterns in four basic tastes with visual cues. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/175194 https://hdl.handle.net/10356/175194 en SCSE23-0158 application/pdf Nanyang Technological University
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Computer and Information Science
EEG
Brain
Taste
Visual cue
Congruent
XAI
Multisensory
spellingShingle Computer and Information Science
EEG
Brain
Taste
Visual cue
Congruent
XAI
Multisensory
Jaiswal, Arnav
Gastronomic gaze: decoding EEG-based discriminative patterns in four basic tastes with visual cues
description Does viewing food pictures evoke any taste-related response? The answer to this question has applications in fields such as neuromarketing. At the intersection of neurotechnology and gastronomy, my research investigates the use of multimodal neural, physiological, and physical sensors to decode and classify taste perceptions with or without visual stimuli. This study builds a multi-stage experiment design, with cognitive, smell, and taste profiling, to evaluate the effect of presenting gastronomic visual stimuli concurrently with different taste stimuli. Our experiment asks whether EEG combined with other sensors improves classification accuracy across five taste conditions: sweet, sour, salty, bitter, and neutral, in the presence of food or non-food images. One of our research hypotheses is "Does the congruency of the visual and taste stimuli increase decoding and classification performance among the four tastes and the control (neutral)?" To this end, I designed an experiment encompassing these conditions and collected multimodal data in a controlled lab setting, using standard taste-testing kits with nine healthy subjects. From the collected data, I analyze and evaluate both data-driven deep learning approaches and eXplainable Artificial Intelligence (XAI) with causal machine learning to better understand the intricate relationships and interplay between taste, cognition, and behavior. I believe this investigation will reveal promising results in EEG taste decoding, while XAI gives meaningful insights into discriminative EEG patterns among tastes.
author2 Guan Cuntai
author_facet Guan Cuntai
Jaiswal, Arnav
format Final Year Project
author Jaiswal, Arnav
author_sort Jaiswal, Arnav
title Gastronomic gaze: decoding EEG-based discriminative patterns in four basic tastes with visual cues
title_short Gastronomic gaze: decoding EEG-based discriminative patterns in four basic tastes with visual cues
title_full Gastronomic gaze: decoding EEG-based discriminative patterns in four basic tastes with visual cues
title_fullStr Gastronomic gaze: decoding EEG-based discriminative patterns in four basic tastes with visual cues
title_full_unstemmed Gastronomic gaze: decoding EEG-based discriminative patterns in four basic tastes with visual cues
title_sort gastronomic gaze: decoding eeg-based discriminative patterns in four basic tastes with visual cues
publisher Nanyang Technological University
publishDate 2024
url https://hdl.handle.net/10356/175194
_version_ 1800916181367914496