A spiking neural network for energy efficient image recognition in internet-of-things application
Nowadays, most of the neuron models used in artificial neural networks (such as ReLU) are second-generation models intended mainly for numerical computation. The third-generation neuron model used in spiking neural networks is modelled on biological neurons and processes signals as discrete impulses. Spiking networks offer high computational efficiency and better biological plausibility than traditional neural networks; however, they still lack suitable training methods that would bring their error rates below those of other well-established networks. The purpose of this project is to define a method for converting a pre-trained convolutional neural network (CNN) into a spiking neural network (SNN). The CNN is written in Python using the PyTorch and Keras frameworks and trained offline. The conversion method is identified through a literature review, which provides a comprehensive discussion of non-spiking and spiking neural network architectures. The aim of this discussion is to identify the differences between the two types of network and the current challenges in converting a non-spiking network into a spiking one, and thus to select the most suitable method in the current research field for developing an SNN. Simulations are carried out to examine the performance of the network before and after conversion. The simulation results show that the SNN uses fewer operations than the original CNN while achieving similar performance, which highlights the potential of SNNs for energy-efficient image recognition in Internet-of-Things applications.
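The abstract above describes converting a pre-trained CNN into a spiking network by reusing its trained weights with spiking neurons. As a rough illustration of that general idea (not the thesis's actual pipeline), the sketch below swaps every ReLU in a small PyTorch model for an integrate-and-fire unit and classifies by accumulating the output over a number of timesteps. The names SpikingReLU, convert_to_snn and run_snn are hypothetical, and a practical conversion would also rescale weights or firing thresholds (weight/activation normalization), which is omitted here.

```python
# Minimal sketch (assumed names: SpikingReLU, convert_to_snn, run_snn) of
# rate-based CNN-to-SNN conversion: trained weights are kept, ReLUs become
# integrate-and-fire units, and the image is fed as a constant input current.
import torch
import torch.nn as nn


class SpikingReLU(nn.Module):
    """Integrate-and-fire unit standing in for a ReLU after conversion."""

    def __init__(self, threshold: float = 1.0):
        super().__init__()
        self.threshold = threshold
        self.membrane = None

    def reset(self):
        self.membrane = None

    def forward(self, x):
        # Accumulate the input current into the membrane potential.
        if self.membrane is None:
            self.membrane = torch.zeros_like(x)
        self.membrane = self.membrane + x
        # Fire wherever the potential crosses threshold, then subtract it
        # ("reset by subtraction").
        spikes = (self.membrane >= self.threshold).float()
        self.membrane = self.membrane - spikes * self.threshold
        return spikes


def convert_to_snn(cnn: nn.Sequential) -> nn.Sequential:
    """Rebuild the trained CNN with every ReLU replaced by an integrate-and-fire
    unit; all other layers and their trained weights are reused unchanged."""
    return nn.Sequential(*[SpikingReLU() if isinstance(m, nn.ReLU) else m for m in cnn])


def run_snn(snn: nn.Sequential, image: torch.Tensor, timesteps: int = 100) -> torch.Tensor:
    """Present the image for several timesteps, accumulate the output layer's
    response, and predict the class with the largest accumulated value."""
    for m in snn.modules():
        if isinstance(m, SpikingReLU):
            m.reset()
    total = torch.zeros(1)
    with torch.no_grad():
        for _ in range(timesteps):
            total = total + snn(image)
    return total.argmax(dim=1)


if __name__ == "__main__":
    # A toy CNN in place of the offline-trained network (weights random here).
    cnn = nn.Sequential(
        nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
        nn.AvgPool2d(2),
        nn.Flatten(),
        nn.Linear(8 * 14 * 14, 10),
    )
    snn = convert_to_snn(cnn)
    image = torch.rand(1, 1, 28, 28)  # stand-in for a 28x28 grayscale input
    print("predicted class:", run_snn(snn, image, timesteps=50).item())
```

The intuition behind the fewer-operations claim is that spiking activity is sparse and event-driven: downstream layers only do work on the timesteps where spikes actually occur, rather than a full multiply-accumulate for every activation on every input.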
Main Author: Chen, ShenShaoJu
Other Authors: Goh Wang Ling
Format: Final Year Project
Language: English
Published: 2019
Subjects: DRNTU::Engineering::Electrical and electronic engineering
Online Access: http://hdl.handle.net/10356/77497
Institution: Nanyang Technological University
id | sg-ntu-dr.10356-77497
---|---
record_format | dspace
spelling | sg-ntu-dr.10356-77497 2023-07-07T17:32:02Z A spiking neural network for energy efficient image recognition in internet-of-things application Chen, ShenShaoJu Goh Wang Ling School of Electrical and Electronic Engineering A*STAR Institute of Microelectronics Chen Yi DRNTU::Engineering::Electrical and electronic engineering Nowadays, most of the neuron models used in artificial neural networks (such as ReLU) are second-generation models intended mainly for numerical computation. The third-generation neuron model used in spiking neural networks is modelled on biological neurons and processes signals as discrete impulses. Spiking networks offer high computational efficiency and better biological plausibility than traditional neural networks; however, they still lack suitable training methods that would bring their error rates below those of other well-established networks. The purpose of this project is to define a method for converting a pre-trained convolutional neural network (CNN) into a spiking neural network (SNN). The CNN is written in Python using the PyTorch and Keras frameworks and trained offline. The conversion method is identified through a literature review, which provides a comprehensive discussion of non-spiking and spiking neural network architectures. The aim of this discussion is to identify the differences between the two types of network and the current challenges in converting a non-spiking network into a spiking one, and thus to select the most suitable method in the current research field for developing an SNN. Simulations are carried out to examine the performance of the network before and after conversion. The simulation results show that the SNN uses fewer operations than the original CNN while achieving similar performance, which highlights the potential of SNNs for energy-efficient image recognition in Internet-of-Things applications. Bachelor of Engineering (Electrical and Electronic Engineering) 2019-05-30T02:20:49Z 2019-05-30T02:20:49Z 2019 Final Year Project (FYP) http://hdl.handle.net/10356/77497 en Nanyang Technological University 66 p. application/pdf
institution | Nanyang Technological University
building | NTU Library
continent | Asia
country | Singapore
content_provider | NTU Library
collection | DR-NTU
language | English
topic | DRNTU::Engineering::Electrical and electronic engineering
description | Nowadays, most of the neuron models used in artificial neural networks (such as ReLU) are second-generation models intended mainly for numerical computation. The third-generation neuron model used in spiking neural networks is modelled on biological neurons and processes signals as discrete impulses. Spiking networks offer high computational efficiency and better biological plausibility than traditional neural networks; however, they still lack suitable training methods that would bring their error rates below those of other well-established networks. The purpose of this project is to define a method for converting a pre-trained convolutional neural network (CNN) into a spiking neural network (SNN). The CNN is written in Python using the PyTorch and Keras frameworks and trained offline. The conversion method is identified through a literature review, which provides a comprehensive discussion of non-spiking and spiking neural network architectures. The aim of this discussion is to identify the differences between the two types of network and the current challenges in converting a non-spiking network into a spiking one, and thus to select the most suitable method in the current research field for developing an SNN. Simulations are carried out to examine the performance of the network before and after conversion. The simulation results show that the SNN uses fewer operations than the original CNN while achieving similar performance, which highlights the potential of SNNs for energy-efficient image recognition in Internet-of-Things applications.
author2 | Goh Wang Ling
format | Final Year Project
author | Chen, ShenShaoJu
title | A spiking neural network for energy efficient image recognition in internet-of-things application
title_sort | spiking neural network for energy efficient image recognition in internet-of-things application
publishDate | 2019
url | http://hdl.handle.net/10356/77497
_version_ | 1772825448871362560