Parametric flatten-t swish: an adaptive nonlinear activation function for deep learning

The activation function is a key component in deep learning that performs a non-linear mapping between inputs and outputs. The Rectified Linear Unit (ReLU) has been the most popular activation function across the deep learning community. However, ReLU has several shortcomings that can result in ine...
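For context on the activations named in the title and abstract, a minimal NumPy sketch follows. The Flatten-T Swish form (x · sigmoid(x) + T for x ≥ 0, otherwise T) and the default threshold T = -0.20 are assumptions based on the commonly cited Flatten-T Swish definition, not details taken from this record; in the parametric variant named in the title, T would be treated as a trainable parameter.

```python
# Minimal sketch of the activations mentioned above.
# The Flatten-T Swish form and T = -0.20 are assumptions, not taken from this record.
import numpy as np

def relu(x):
    """Rectified Linear Unit: max(0, x)."""
    return np.maximum(0.0, x)

def parametric_flatten_t_swish(x, T=-0.20):
    """Assumed form: x * sigmoid(x) + T for x >= 0, otherwise T.
    In the parametric variant, T would be learned during training."""
    sigmoid = 1.0 / (1.0 + np.exp(-x))
    return np.where(x >= 0, x * sigmoid + T, T)

if __name__ == "__main__":
    x = np.linspace(-3, 3, 7)
    print("relu:", relu(x))
    print("pfts:", parametric_flatten_t_swish(x))
```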

Bibliographic Details
Main Authors: Hock, Hung Chieng, Wahid, Noorhaniza, Ong, Pauline
Format: Article
Language: English
Published: Universiti Utara Malaysia 2021
Subjects:
Online Access:https://repo.uum.edu.my/id/eprint/28125/1/document%20%284%29.pdf
https://doi.org/10.32890/jict.20.1.2021.9267
https://repo.uum.edu.my/id/eprint/28125/
https://www.e-journal.uum.edu.my/index.php/jict/article/view/12398
Institution: Universiti Utara Malaysia