Improving Convolutional Neural Network (CNN) architecture (miniVGGNet) with batch normalization and learning rate decay factor for image classification
Main Authors: | |
Format: | Article |
Language: | English |
Published: | UTHM Publisher, 2019 |
Online Access: | http://psasir.upm.edu.my/id/eprint/80197/1/Improving%20Convolutional%20Neural%20Network%20%28CNN%29%20architecture%20%28miniVGGNet%29%20with%20Batch%20Normalization%20and%20Learning%20Rate%20Decay%20Factor%20for%20Image%20Classification.pdf http://psasir.upm.edu.my/id/eprint/80197/ |
Institution: | Universiti Putra Malaysia |
Summary: | Image classification is a classical problem in image processing, computer vision, and machine learning. This paper presents a performance analysis of a Convolutional Neural Network (CNN) for image classification using deep learning. MiniVGGNet is the CNN architecture used in this paper to train a network for image classification, and CIFAR-10 is the dataset selected for this purpose. The performance of the network was improved through hyperparameter tuning, namely adding batch normalization and adjusting the learning rate decay factor. This paper compares the performance of the trained network when a batch normalization layer is added and when the value of the learning rate decay factor is adjusted. Based on the experimental results, adding a batch normalization layer improves classification accuracy from 80% to 82%. Applying a learning rate decay factor further improves classification accuracy to 83% and reduces the effects of overfitting visible in the learning plot. The performance analysis shows that hyperparameter tuning can improve the performance of the network and increase the model's ability to generalize. |
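For illustration, the sketch below shows one common way to set up the kind of experiment the summary describes: a miniVGGNet-style Keras model for CIFAR-10 with batch normalization layers and an inverse-time learning rate decay. It is not the authors' code; the layer configuration, base learning rate, batch size, number of epochs, and decay factor are assumptions for the example, not values reported in the paper.

```python
# Sketch of a miniVGGNet-style CNN on CIFAR-10 with batch normalization and a
# learning rate decay factor. All hyperparameter values below are illustrative.
import tensorflow as tf
from tensorflow.keras import layers, models


def build_mini_vggnet(input_shape=(32, 32, 3), num_classes=10):
    model = models.Sequential()
    # Block 1: two 3x3 convolutions, each followed by batch normalization.
    model.add(layers.Conv2D(32, (3, 3), padding="same", activation="relu",
                            input_shape=input_shape))
    model.add(layers.BatchNormalization())
    model.add(layers.Conv2D(32, (3, 3), padding="same", activation="relu"))
    model.add(layers.BatchNormalization())
    model.add(layers.MaxPooling2D((2, 2)))
    model.add(layers.Dropout(0.25))
    # Block 2: same pattern with 64 filters.
    model.add(layers.Conv2D(64, (3, 3), padding="same", activation="relu"))
    model.add(layers.BatchNormalization())
    model.add(layers.Conv2D(64, (3, 3), padding="same", activation="relu"))
    model.add(layers.BatchNormalization())
    model.add(layers.MaxPooling2D((2, 2)))
    model.add(layers.Dropout(0.25))
    # Classifier head.
    model.add(layers.Flatten())
    model.add(layers.Dense(512, activation="relu"))
    model.add(layers.BatchNormalization())
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(num_classes, activation="softmax"))
    return model


# Load CIFAR-10 and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Learning rate decay factor: InverseTimeDecay reproduces the classic
# lr = lr0 / (1 + decay_rate * step / decay_steps) schedule.
epochs = 40
batch_size = 64
steps_per_epoch = len(x_train) // batch_size
lr_schedule = tf.keras.optimizers.schedules.InverseTimeDecay(
    initial_learning_rate=0.01,
    decay_steps=steps_per_epoch,
    decay_rate=0.01 / epochs)

model = build_mini_vggnet()
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=lr_schedule, momentum=0.9),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"])
model.fit(x_train, y_train, batch_size=batch_size, epochs=epochs,
          validation_data=(x_test, y_test))
```

Removing the `BatchNormalization` layers or replacing the decay schedule with a constant learning rate gives the baseline configurations against which the paper's accuracy comparisons can be understood.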