SEFRON: a new spiking neuron model with time-varying synaptic efficacy function for pattern classification
Format: Article
Language: English
Published: 2020
Online Access: https://hdl.handle.net/10356/144620
Institution: Nanyang Technological University
Summary: This paper presents a new time-varying long-term Synaptic Efficacy Function-based leaky-integrate-and-fire neuRON model, referred to as SEFRON, and its supervised learning rule for pattern classification problems. The time-varying synaptic efficacy function is represented by a sum of amplitude-modulated Gaussian distribution functions located at different times. For a given pattern, SEFRON's learning rule determines the changes in the amplitudes of the weights at selected presynaptic spike times by minimizing a new error function that reflects the difference between the desired and actual postsynaptic firing times. Similar to the gamma-aminobutyric acid (GABA)-switch phenomenon observed in biological neurons, which switch between excitatory and inhibitory postsynaptic potentials according to physiological needs, the time-varying synapse model proposed in this paper allows the synaptic efficacy (weight) to switch sign in a continuous manner. The computational power and functioning of SEFRON are first illustrated using a binary pattern classification problem. Detailed performance comparisons of a single SEFRON classifier with other spiking neural networks (SNNs) are then presented using four benchmark data sets from the UCI machine learning repository. The results clearly indicate that a single SEFRON provides generalization performance similar to that of other SNNs with multiple layers and multiple neurons.
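Illustrative sketch (not from the paper): based only on the abstract's description, the time-varying synaptic efficacy can be read as a signed sum of Gaussian bumps centred at selected presynaptic spike times. The Python snippet below is a minimal sketch under that assumption; the symbol names (centres t_k, amplitudes w_k, width sigma) and the numerical values are illustrative and do not reproduce SEFRON's actual parameterization or learning rule.

```python
import numpy as np

def synaptic_efficacy(t, centres, amplitudes, sigma):
    """Time-varying weight w(t) = sum_k w_k * exp(-(t - t_k)^2 / (2 * sigma^2)).

    Because the amplitudes w_k may be positive or negative, the efficacy can
    change sign continuously over time, mirroring the GABA-switch behaviour
    highlighted in the abstract.
    """
    t = np.asarray(t, dtype=float)[..., None]                 # shape (..., 1)
    gaussians = np.exp(-(t - centres) ** 2 / (2.0 * sigma ** 2))  # (..., K)
    return gaussians @ amplitudes                              # shape (...,)

# Example (illustrative values): two excitatory bumps and one inhibitory bump
# within a 100 ms simulation window.
centres = np.array([20.0, 50.0, 80.0])    # presynaptic spike times (ms)
amplitudes = np.array([0.8, -0.5, 0.3])   # signed weight amplitudes
times = np.linspace(0.0, 100.0, 5)
print(synaptic_efficacy(times, centres, amplitudes, sigma=10.0))
```

Evaluating this function along the simulation window shows the same synapse acting as excitatory at some times and inhibitory at others, which is the continuous sign-switching behaviour the summary attributes to the model.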