Spiking neural network for object recognition

Bibliographic Details
Main Author: Li, Wei
Other Authors: Meng-Hiot Lim
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University, 2023
Online Access: https://hdl.handle.net/10356/172518
Institution: Nanyang Technological University
Description
Summary: While artificial intelligence technology has made significant strides and found wide-ranging applications, there persists a demand for AI systems that more closely emulate natural intelligence. One promising avenue is the Spiking Neural Network (SNN), constructed from spiking neurons to replicate the biologically plausible computations observed in the brain. Unlike conventional networks, which process inputs synchronously as dense numerical activations, SNNs communicate through discrete spikes and operate in an event-driven manner, and researchers have extensively investigated their performance. This dissertation centers on the application of SNNs to object recognition. It evaluates the Spatio-Temporal Backpropagation (STBP) method within a shallow network architecture on the MNIST, CIFAR-10, and CIFAR-100 image-classification benchmarks, comparing it with traditional Convolutional Neural Networks (CNNs). Batch normalization through time was employed to stabilize the training process. Additionally, a deeper neural network was used to analyze the performance of the SNN, with the aim of improving phoneme recognition. The work culminates in a comprehensive review of recent advancements in SNNs, providing valuable insight into the current state of the field.
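
The abstract names two techniques, STBP and batch normalization through time, without detailing them. The sketch below illustrates how they are commonly combined in a PyTorch-style implementation: a leaky integrate-and-fire (LIF) layer whose non-differentiable spike function is trained with a rectangular surrogate gradient (the core idea behind STBP), plus a separate BatchNorm at each simulation step. This is a minimal sketch under assumed hyperparameters (time steps, leak factor tau, threshold v_th, surrogate width); none of these names or values come from the thesis itself, and the actual architectures used there may differ.

```python
# Minimal sketch: LIF layer with a surrogate spike gradient (STBP-style)
# and per-time-step batch normalization ("batch norm through time").
# All hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn


class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate gradient
    in the backward pass, as used in STBP-style training."""

    @staticmethod
    def forward(ctx, v, v_th):
        ctx.save_for_backward(v)
        ctx.v_th = v_th
        return (v >= v_th).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass gradients only within a window around the threshold
        # (window half-width 0.5 is an assumption).
        surrogate = (torch.abs(v - ctx.v_th) < 0.5).float()
        return grad_out * surrogate, None


class LIFLayer(nn.Module):
    """Linear synapse + LIF membrane dynamics unrolled over `steps`
    time steps, with one BatchNorm per step (batch norm through time)."""

    def __init__(self, in_f, out_f, steps=8, tau=0.5, v_th=1.0):
        super().__init__()
        self.fc = nn.Linear(in_f, out_f)
        # A distinct BatchNorm1d per time step: the "through time" part.
        self.bn = nn.ModuleList(nn.BatchNorm1d(out_f) for _ in range(steps))
        self.steps, self.tau, self.v_th = steps, tau, v_th

    def forward(self, x):  # x: (batch, in_f), constant rate-coded input
        v = torch.zeros(x.size(0), self.fc.out_features, device=x.device)
        spikes = []
        for t in range(self.steps):
            i_t = self.bn[t](self.fc(x))      # per-step normalized current
            v = self.tau * v + i_t            # leaky membrane integration
            s = SpikeFn.apply(v, self.v_th)   # spike via surrogate gradient
            v = v * (1.0 - s)                 # hard reset after a spike
            spikes.append(s)
        # Average firing rate over time serves as the layer's output code.
        return torch.stack(spikes).mean(dim=0)


if __name__ == "__main__":
    layer = LIFLayer(784, 10)             # e.g. flattened MNIST digits
    rates = layer(torch.rand(32, 784))    # forward pass over 8 time steps
    print(rates.shape)                    # torch.Size([32, 10])
```

Because the spike nonlinearity is a step function, its true gradient is zero almost everywhere; the rectangular surrogate lets error signals propagate both across layers (spatial) and across time steps (temporal), which is what gives STBP its name. The per-step BatchNorm reflects the observation that spike statistics drift over the simulation window, so sharing one set of normalization statistics across all steps tends to work poorly.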