Neural modeling of episodic memory: Encoding, retrieval, and forgetting

This paper presents a neural model that learns episodic traces in response to a continuous stream of sensory input and feedback received from the environment. The proposed model, based on a fusion Adaptive Resonance Theory (fusion ART) network, extracts key events and encodes the spatio-temporal relations between events by creating cognitive nodes dynamically. The model further incorporates a novel memory search procedure that continuously performs a parallel search of stored episodic traces. Combined with a mechanism of gradual forgetting, the model achieves a high level of memory performance and robustness while keeping memory consumption under control over time. We present experimental studies in which the proposed episodic memory model is evaluated on the memory consumption for encoding events and episodes as well as on recall accuracy using partial and erroneous cues. Our experimental results show that (1) the model produces highly robust performance in encoding and recalling events and episodes, even with incomplete and noisy cues; (2) the model provides enhanced performance in noisy environments due to the process of forgetting; and (3) compared with prior models of spatio-temporal memory, our model shows a higher tolerance toward noise and errors in retrieval cues.
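
The abstract outlines an ART-style scheme: key events are encoded as dynamically recruited category ("cognitive") nodes via a vigilance-based match, recall is a parallel search against possibly partial or noisy cues, and gradual forgetting prunes weak nodes to bound memory consumption. The sketch below is a minimal, hypothetical illustration of that general idea in Python; the class name EpisodicStore and the parameters vigilance, decay, and prune_threshold are assumptions for illustration only, not the paper's fusion ART implementation.

    # Illustrative sketch only -- NOT the authors' fusion ART model.
    # Events are matched against stored nodes with a vigilance threshold;
    # unmatched events recruit new nodes; node strengths decay over time
    # and weak nodes are forgotten.
    import numpy as np

    class EpisodicStore:
        def __init__(self, vigilance=0.8, decay=0.99, prune_threshold=0.1):
            self.vigilance = vigilance            # minimum match score to reuse a node
            self.decay = decay                    # per-step forgetting factor
            self.prune_threshold = prune_threshold
            self.nodes = []                       # list of (template_vector, strength)

        def _match(self, x, w):
            # fuzzy-ART-style match function: |x AND w| / |x|
            return np.sum(np.minimum(x, w)) / (np.sum(x) + 1e-9)

        def encode(self, event):
            """Encode an event vector, reusing a resonant node or creating a new one."""
            event = np.asarray(event, dtype=float)
            best, best_score = None, -1.0
            for i, (w, s) in enumerate(self.nodes):
                score = self._match(event, w)
                if score > best_score:
                    best, best_score = i, score
            if best is not None and best_score >= self.vigilance:
                w, s = self.nodes[best]
                # update template toward the event, refresh strength
                self.nodes[best] = (np.minimum(w, event), 1.0)
            else:
                # recruit a new cognitive node for the unmatched event
                self.nodes.append((event.copy(), 1.0))
            self.forget()

        def forget(self):
            """Gradually decay node strengths and prune nodes below threshold."""
            self.nodes = [(w, s * self.decay) for (w, s) in self.nodes
                          if s * self.decay >= self.prune_threshold]

        def recall(self, cue):
            """Return the stored template that best matches a (possibly partial) cue."""
            cue = np.asarray(cue, dtype=float)
            if not self.nodes:
                return None
            return max(self.nodes, key=lambda ws: self._match(cue, ws[0]))[0]

A typical usage of this sketch would be store.encode(event_vector) after each key event and store.recall(cue_vector) to retrieve the best-matching stored trace from a partial or noisy cue.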

Bibliographic Details
Main Authors: WANG, Wenwen, SUBAGDJA, Budhitama, TAN, Ah-hwee, STARZYK, Janusz A.
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2012
Subjects: episodic memory; agent; ART based network; hierarchical structure; memory robustness; forgetting; Unreal Tournament; Databases and Information Systems; Programming Languages and Compilers; Software Engineering
Online Access: https://ink.library.smu.edu.sg/sis_research/5202
https://ink.library.smu.edu.sg/context/sis_research/article/6205/viewcontent/Neural_Modeling_of_Episodic_Memory___TNNLS_2012_Preprint.pdf
Institution: Singapore Management University
Record ID: sg-smu-ink.sis_research-6205
Collection: Research Collection School Of Computing and Information Systems, InK@SMU (SMU Libraries)
DOI: 10.1109/TNNLS.2012.2208477
License: http://creativecommons.org/licenses/by-nc-nd/4.0/
Publication Date: 2012-10-01