Layer-wise deep learning for object classifications
Although global backpropagation has become the mainstream training method for convolutional neural networks, it still has inherent drawbacks, such as backward locking and memory reuse problems. Moreover, a neural network trained by global backpropagation is also regarded as...
Saved in:
Main Author: | Xu, Lei |
---|---|
Other Authors: | Cheah, Chien Chern |
Format: | Thesis-Master by Coursework |
Language: | English |
Published: | Nanyang Technological University, 2023 |
Subjects: | Engineering::Electrical and electronic engineering |
Online Access: | https://hdl.handle.net/10356/168048 |
Institution: | Nanyang Technological University |
id |
sg-ntu-dr.10356-168048 |
---|---|
record_format |
dspace |
school |
School of Electrical and Electronic Engineering |
supervisor |
Cheah, Chien Chern (ECCCheah@ntu.edu.sg) |
degree |
Master of Science (Computer Control and Automation) |
citation |
Xu, L. (2023). Layer-wise deep learning for object classifications. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/168048 |
institution |
Nanyang Technological University |
building |
NTU Library |
continent |
Asia |
country |
Singapore |
content_provider |
NTU Library |
collection |
DR-NTU |
language |
English |
topic |
Engineering::Electrical and electronic engineering |
description |
Although global backpropagation has become the mainstream training method for convolutional neural networks, it still has inherent drawbacks, such as backward locking and memory reuse problems. Moreover, a neural network trained by global backpropagation is also regarded as a black box that is hard to interpret. In view of this, layer-wise learning has recently attracted attention as an alternative to the global backpropagation training approach. In this dissertation, we first applied the layer-wise learning method to the ResNet-18 model and then evaluated its performance on common benchmark datasets. The experimental results demonstrated good convergence of the layer-wise learning method with the ResNet-18 network, and also showed a reasonable trade-off between performance and the number of parameters in the network. Although the testing accuracy was slightly lower than that of the global backpropagation method, layer-wise learning showed that a structure with fewer layers can achieve reasonable accuracy, which suggests the potential of employing the layer-wise learning method to determine an appropriate number of layers. We then modified the hierarchical structure of the original ResNet-18 model to improve its performance. According to the experiments, the modified network further reduced the number of parameters in the network, with slightly lower performance than the global BP and SGD method and similar or better performance than the original layer-wise learning method.
Keywords: Deep learning, Layer-Wise Learning, ResNet, CNNs, Separability. |
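The greedy layer-wise scheme the abstract describes — training each layer against its own local objective, then freezing it before training the next — can be sketched in miniature. This is an illustrative NumPy toy (the dataset, layer widths, learning rate, and auxiliary linear-probe heads are all assumptions for the sketch), not the dissertation's actual ResNet-18 training code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class problem: points labeled by the sign of x0 + x1.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_layer(h_in, y, width, steps=500, lr=0.5):
    """Train one ReLU layer plus a local linear classifier (auxiliary head)
    on its output, using only the local cross-entropy loss — no gradient
    flows back to earlier, already-frozen layers."""
    n, d = h_in.shape
    W = rng.normal(scale=0.5, size=(d, width))   # layer weights
    V = rng.normal(scale=0.5, size=(width, 2))   # auxiliary head weights
    Y = np.eye(2)[y]                             # one-hot labels
    for _ in range(steps):
        a = h_in @ W                             # pre-activation
        h = np.maximum(a, 0)                     # ReLU
        p = softmax(h @ V)
        g_logits = (p - Y) / n                   # dL/dlogits for CE loss
        g_h = g_logits @ V.T * (a > 0)           # local gradient only
        V -= lr * h.T @ g_logits
        W -= lr * h_in.T @ g_h
    h = np.maximum(h_in @ W, 0)
    acc = (softmax(h @ V).argmax(axis=1) == y).mean()
    return h, acc                                # frozen features + probe accuracy

# Greedy stack: train layer 1, freeze its output, then train layer 2 on top.
h1, acc1 = train_layer(X, y, width=8)
h2, acc2 = train_layer(h1, y, width=8)
print(f"layer 1 probe accuracy: {acc1:.2f}")
print(f"layer 2 probe accuracy: {acc2:.2f}")
```

Because each layer sees only its local loss, there is no backward locking: a layer can be trained (and its depth usefulness assessed via its probe accuracy) before any deeper layer exists, which mirrors the dissertation's use of layer-wise accuracy to judge an appropriate number of layers.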
author2 |
Cheah Chien Chern |
format |
Thesis-Master by Coursework |
author |
Xu, Lei |
title |
Layer-wise deep learning for object classifications |
publisher |
Nanyang Technological University |
publishDate |
2023 |
url |
https://hdl.handle.net/10356/168048 |