Fully decoupled neural network learning using delayed gradients

Training neural networks with back-propagation (BP) requires a sequential passing of activations and gradients. This has been recognized as the lockings (i.e., the forward, backward, and update lockings) among modules (each module contains a stack of layers) inherited from BP. In this paper,...
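To illustrate the idea behind the abstract, here is a minimal sketch (not the authors' algorithm) of module-wise training with a one-step delayed gradient: the downstream module updates immediately, while the upstream module applies the gradient signal it received on the previous step, so the two no longer wait on each other. The two-linear-module network, toy regression task, learning rate, and one-step delay are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.3, size=(4, 8))   # upstream module (assumed linear)
W2 = rng.normal(scale=0.3, size=(8, 1))   # downstream module (assumed linear)
lr = 0.05                                  # assumed learning rate

# Fixed evaluation set: target is the sum of the input features (toy task).
x_eval = rng.normal(size=(64, 4))
y_eval = x_eval.sum(axis=1, keepdims=True)

def eval_loss():
    pred = x_eval @ W1 @ W2
    return float(np.mean((pred - y_eval) ** 2))

loss_before = eval_loss()
stale = None  # gradient signal queued for the upstream module (one-step delay)

for step in range(300):
    x = rng.normal(size=(16, 4))
    y = x.sum(axis=1, keepdims=True)

    h = x @ W1                 # module 1 forward
    err = (h @ W2) - y         # module 2 forward + error

    gW2 = h.T @ err / len(x)           # local gradient for module 2
    signal = (x, err @ W2.T)           # gradient w.r.t. h, sent upstream

    # Module 1 updates with the DELAYED signal from the previous step,
    # so it never waits for the current backward pass to finish.
    if stale is not None:
        x_old, gh_old = stale
        W1 -= lr * (x_old.T @ gh_old / len(x_old))
    stale = signal

    W2 -= lr * gW2             # module 2 updates immediately

loss_after = eval_loss()
```

Despite the staleness, the upstream updates still point in roughly the right descent direction for small learning rates, which is the intuition the paper's abstract appeals to.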

Full description

Bibliographic Details
Main Authors: Zhuang, Huiping, Wang, Yi, Liu, Qinglai, Lin, Zhiping
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2024
Subjects:
Online Access: https://hdl.handle.net/10356/174476