COMPARATIVE STUDY OF SURROGATE TECHNIQUES FOR HYPERPARAMETER OPTIMIZATION IN CONVOLUTIONAL NEURAL NETWORK
Saved in:

Main Author: |  |
---|---|
Format: | Thesis |
Language: | English |
Published: | 2023 |
Subjects: |  |
Online Access: | http://utpedia.utp.edu.my/id/eprint/24632/1/NurshazlynMohdAszemi_17007352.pdf http://utpedia.utp.edu.my/id/eprint/24632/ |
Institution: | Universiti Teknologi Petronas |
Summary: | Optimizing hyperparameters in a CNN is tedious for many researchers and practitioners. It requires a high degree of expertise or extensive experience, and such manual optimization is likely to be biased. Hyperparameters in deep learning can be divided into two types: those associated with the learning algorithm, such as the learning rate and the number of iterations or epochs per training run, and those related to the design of the deep neural network itself, such as the number of layers the network needs and the number of filters in a given convolutional layer. Setting these hyperparameters correctly is often critical for reaching the full potential of the chosen or designed deep neural network, and consequently influences the quality of the produced results. Currently, different methods and approaches have been introduced to mitigate the issues of manual optimization. |
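As a concrete illustration of the two hyperparameter types the summary distinguishes, and of one automated alternative to manual tuning, the sketch below runs a plain random search over a CNN-style search space. This is a minimal illustration only: all names, value ranges, and the `evaluate` callback are hypothetical stand-ins, not taken from the thesis.

```python
import random

# Hypothetical CNN search space covering both hyperparameter types
# described in the summary (all values are illustrative assumptions).
SEARCH_SPACE = {
    # Type 1: learning-algorithm hyperparameters
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "epochs": [10, 20, 50],
    # Type 2: network-design hyperparameters
    "num_conv_layers": [2, 3, 4],
    "filters_per_layer": [16, 32, 64],
}


def sample_configuration(space, rng=random):
    """Draw one candidate configuration uniformly at random."""
    return {name: rng.choice(values) for name, values in space.items()}


def random_search(space, evaluate, n_trials=20, seed=0):
    """Return the best (configuration, score) found over n_trials draws.

    `evaluate` is a user-supplied callback that would, in practice, train
    a CNN with the given configuration and return a validation score.
    """
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = sample_configuration(space, rng)
        score = evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score
```

In practice the `evaluate` callback is where the cost lies: each call trains a full network, which is why surrogate techniques (the subject of this thesis) model the score landscape instead of evaluating every candidate directly.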