An Online Unsupervised Structural Plasticity Algorithm for Spiking Neural Networks
Format: Article
Language: English
Published: 2016
Online Access: https://hdl.handle.net/10356/80901 http://hdl.handle.net/10220/41059
Institution: Nanyang Technological University
Summary: In this paper, we propose a novel winner-take-all (WTA) architecture employing neurons with nonlinear dendrites, together with an online unsupervised structural plasticity rule for training it. Furthermore, to aid hardware implementation, our network employs only binary synapses. The proposed learning rule is inspired by spike-timing-dependent plasticity but acts differently on each dendrite according to its activation level; it trains the WTA network through the formation and elimination of connections between inputs and synapses. To demonstrate the performance of the proposed network and learning rule, we apply it to two-class, four-class, and six-class classification of random Poisson spike-time inputs. The results indicate that proper tuning of the inhibitory time constant of the WTA yields a tradeoff between the specificity and sensitivity of the network, and we use this time constant to set the number of subpatterns per pattern to be detected. While the percentages of successful trials are 92%, 88%, and 82% for two-class, four-class, and six-class classification when patterns are not subdivided, they rise to 100% when each pattern is subdivided into 5 or 10 subpatterns. The former scenario, with no pattern subdivision, is however more resilient to jitter than the latter.
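The summary describes the learning mechanics only at a high level: binary synapses mean learning must proceed by rewiring connections rather than adjusting weights, guided by each dendrite's activation. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's actual rule. All names, network dimensions, the squaring dendritic nonlinearity, and the use of rate-like activity values in place of Poisson spike timing are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed; the paper's values may differ).
N_INPUTS = 100        # afferent input lines
N_NEURONS = 2         # WTA output neurons (two-class case)
N_DENDRITES = 5       # nonlinear dendrites per neuron
SYN_PER_DENDRITE = 8  # binary synapses per dendrite

# conn[n, d, s] holds the index of the input line wired to synapse s of
# dendrite d on neuron n. Synapses are binary: a connection either exists
# (weight 1) or does not, so learning is rewiring, not weight updates.
conn = rng.integers(0, N_INPUTS, size=(N_NEURONS, N_DENDRITES, SYN_PER_DENDRITE))

def dendritic_activation(x, neuron):
    """Sum the synaptic drive per dendrite, then apply a nonlinearity.

    x: length-N_INPUTS vector of filtered input activity (a rate-like
    stand-in for spike trains). Returns one activation per dendrite;
    squaring is an assumed stand-in for dendritic supralinearity.
    """
    drive = x[conn[neuron]].sum(axis=1)  # linear sum per dendrite
    return drive ** 2                    # nonlinear dendritic transfer

def structural_update(x, winner):
    """Rewire the winner's weakest synapse on its least-active dendrite.

    A connection whose input is weakly driven is eliminated, and a new
    connection to the currently most active input line is formed, so the
    update differs per dendrite based on its activation level.
    """
    act = dendritic_activation(x, winner)
    d = int(np.argmin(act))            # least-activated dendrite
    syn_inputs = conn[winner, d]
    s = int(np.argmin(x[syn_inputs]))  # synapse with the weakest drive
    candidate = int(np.argmax(x))      # most active input line
    if x[candidate] > x[syn_inputs[s]]:
        conn[winner, d, s] = candidate  # eliminate old, form new connection

def wta_step(x):
    """One pattern presentation: pick the winner, then rewire it."""
    totals = [dendritic_activation(x, n).sum() for n in range(N_NEURONS)]
    winner = int(np.argmax(totals))
    structural_update(x, winner)
    return winner

# Example: drive the network with random activity traces.
for _ in range(200):
    x = rng.random(N_INPUTS)  # stand-in for filtered Poisson spike trains
    wta_step(x)
```

Because only integer connection indices change, a sketch like this maps naturally onto hardware with single-bit synaptic memory, which is the motivation the summary gives for restricting the network to binary synapses.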