Efficient implementation of activation functions for LSTM accelerators

Bibliographic Details
Main Authors: Chong, Yi Sheng, Goh, Wang Ling, Ong, Yew-Soon, Nambiar, Vishnu P., Do, Anh Tuan
Other Authors: School of Electrical and Electronic Engineering
Format: Conference or Workshop Item
Language: English
Published: 2021
Subjects:
Online Access: https://hdl.handle.net/10356/153121
Institution: Nanyang Technological University
Description
Summary: Activation functions such as the hyperbolic tangent (tanh) and logistic sigmoid (sigmoid) are critical computing elements in a long short-term memory (LSTM) cell and network. These activation functions are non-linear, which makes their hardware implementation challenging. Area-efficient, high-performance hardware implementations of these functions are therefore crucial to achieving high throughput in an LSTM accelerator. In this work, we propose an approximation scheme suitable for both the tanh and sigmoid functions. The proposed hardware for the sigmoid function is 8.3 times smaller than the state of the art, while the tanh implementation is the second-smallest design. When the approximated tanh and sigmoid functions with 2% error are applied in an LSTM cell computation, the final hidden state and cell state show errors of 3.1% and 5.8%, respectively. When the same approximated functions are applied to a single-layer LSTM network with 64 hidden nodes, accuracy drops by only 2.8%. This small yet accurate activation function hardware is promising for Internet of Things (IoT) applications, where accuracy can be traded for ultra-low power consumption.
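
The abstract does not detail the approximation itself, so the sketch below is a point of reference only, not the paper's design: the well-known PLAN piecewise-linear sigmoid approximation (Amin, Curtis, and Hayes-Gill, 1997), whose segment slopes are powers of two so that hardware multipliers reduce to shift-and-add logic, and whose maximum error (about 1.9%) is comparable to the 2% figure quoted above. A single such unit can serve both activations through the identity tanh(x) = 2*sigmoid(2x) - 1, which is one standard way to make one approximation "suitable for both tanh and sigmoid".

/* Illustrative sketch (assumed scheme, not the paper's): PLAN
 * piecewise-linear sigmoid, reused for tanh via the identity
 * tanh(x) = 2*sigmoid(2x) - 1. Compile with: cc plan.c -lm */
#include <math.h>
#include <stdio.h>

/* PLAN sigmoid: four segments on |x|, mirrored for negative inputs
 * using sigmoid(-x) = 1 - sigmoid(x). All slopes are powers of two. */
static double plan_sigmoid(double x)
{
    double a = fabs(x);
    double y;
    if (a < 1.0)
        y = 0.25 * a + 0.5;        /* slope 2^-2 */
    else if (a < 2.375)
        y = 0.125 * a + 0.625;     /* slope 2^-3 */
    else if (a < 5.0)
        y = 0.03125 * a + 0.84375; /* slope 2^-5 */
    else
        y = 1.0;                   /* saturation region */
    return (x < 0.0) ? 1.0 - y : y;
}

/* One approximation unit serves both activation functions. */
static double plan_tanh(double x)
{
    return 2.0 * plan_sigmoid(2.0 * x) - 1.0;
}

int main(void)
{
    /* Compare against the exact functions over a small grid. */
    for (double x = -4.0; x <= 4.0; x += 1.0) {
        printf("x = %+4.1f  sigmoid err = %+.4f  tanh err = %+.4f\n",
               x,
               plan_sigmoid(x) - 1.0 / (1.0 + exp(-x)),
               plan_tanh(x) - tanh(x));
    }
    return 0;
}

Because the breakpoints and intercepts are also representable in a few fixed-point bits, a scheme of this shape needs no multipliers or lookup tables, which is the usual route to the kind of area savings the abstract reports.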