Auxiliary network design for local learning in deep neural networks

The training of deep neural networks relies on the backpropagation algorithm, which consists of a forward pass, a backward pass and a parameter update. The output of each layer is produced from the outputs of the layers below it in a sequential manner, and gradients can only flow back layer by layer. This forces the majority of the network to sit idle during training and leads to inefficiency, a problem known as the forward, backward and update lockings. To break these lockings, various methods of decoupled learning have been investigated, but current methods either suffer a significant drop in accuracy or a dramatic increase in memory usage. To remove these limitations, this Final Year Project proposes a new form of decoupled learning, named the decoupled neural network training scheme with re-computation and weight prediction (DTRP). The proposed method splits a neural network into several modules and trains them synchronously on different workers. In particular, re-computation is adopted to solve the memory explosion problem, and a weight prediction scheme with several weight predictors is proposed to deal with the weight delay caused by re-computation. A batch compensation scheme is also explored, which allows DTRP to run faster. Experiments on image classification with various convolutional neural networks show accuracy comparable to or better than state-of-the-art methods and standard backpropagation. The experiments also show that the memory explosion problem is effectively solved and that a significant acceleration is achieved. Moreover, DTRP can be applied to train very wide as well as extremely deep networks.
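The abstract describes the core mechanics only at a high level, so a minimal sketch may help. The PyTorch code below is an illustration under assumed details, not the thesis implementation: the two-module split, layer sizes and SGD optimizers are invented here. It shows module-wise decoupled training with re-computation: modules exchange only a detached activation and a boundary gradient, and the first module rebuilds its local graph on demand instead of storing it during the forward pass.

    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Split a small network into two modules that could live on
    # different workers (hypothetical split for illustration).
    module1 = nn.Sequential(nn.Linear(8, 16), nn.ReLU())
    module2 = nn.Sequential(nn.Linear(16, 4))
    opt1 = torch.optim.SGD(module1.parameters(), lr=0.1)
    opt2 = torch.optim.SGD(module2.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()

    x = torch.randn(32, 8)
    y = torch.randint(0, 4, (32,))

    # Forward through module1 without building a graph: only the
    # detached activation crosses the module boundary, so module1's
    # intermediate activations are never stored.
    with torch.no_grad():
        h = module1(x)

    # The downstream module trains on the detached activation as usual.
    h_in = h.detach().requires_grad_(True)
    loss = loss_fn(module2(h_in), y)
    opt2.zero_grad()
    loss.backward()
    opt2.step()

    # Re-computation: module1 re-runs its forward pass to rebuild the
    # local graph it never stored, then backpropagates the boundary
    # gradient h_in.grad received from module2.
    h_rec = module1(x)
    opt1.zero_grad()
    h_rec.backward(h_in.grad)
    opt1.step()

In the actual multi-worker setting the modules run concurrently on successive batches, so by the time module1 re-computes and updates, its weights are several steps stale; that staleness is what the thesis's weight prediction scheme addresses.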

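On weight prediction itself, one simple possibility (a hedged sketch in the spirit of SpecTrain-style momentum extrapolation, not one of the predictors actually proposed in the thesis) is to estimate the weights s steps ahead along the optimizer's momentum buffer, i.e. w_hat = w - lr * s * v:

    import torch

    def predict_weights(params, velocities, lr, steps):
        """Linearly extrapolate s future SGD-momentum updates:
        return w - lr * steps * v for each (w, v) pair."""
        return [w - lr * steps * v for w, v in zip(params, velocities)]

    # Toy usage with hypothetical tensors; v stands in for the
    # optimizer's momentum buffer.
    w = [torch.ones(3)]
    v = [torch.full((3,), 0.5)]
    print(predict_weights(w, v, lr=0.1, steps=2))  # [tensor([0.9, 0.9, 0.9])]

A predictor of this kind lets a module compute its re-computation forward pass with an estimate of the weights it will hold when the gradient finally arrives, rather than with stale ones.
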
Bibliographic Details
Main Author: Peng, Jiawei
Other Authors: Lin, Zhiping
Format: Final Year Project (FYP)
Degree: Bachelor of Engineering (Electrical and Electronic Engineering)
School: School of Electrical and Electronic Engineering
Language: English
Published: Nanyang Technological University, 2021
Subjects: Engineering::Electrical and electronic engineering
Online Access: https://hdl.handle.net/10356/149869
Institution: Nanyang Technological University