Automated-tuned hyper-parameter deep neural network by using arithmetic optimization algorithm for Lorenz chaotic system


Bibliographic Details
Main Authors: Ayop Azmi, Nurnajmin Qasrina Ann, Pebrianti, Dwi, Abas, Mohammad Fadhil, Bayuaji, Luhur
Format: Article
Language: English
Published: Institute of Advanced Engineering and Science (IAES) 2023
Subjects:
Online Access:http://irep.iium.edu.my/101897/1/automated%20hyper%20parameter%20tuning.pdf
http://irep.iium.edu.my/101897/7/Scopus%20-%20Automated-tuned%20hyper-parameter.pdf
http://irep.iium.edu.my/101897/
https://ijece.iaescore.com/index.php/IJECE/article/view/28393/16425
Institution: Universiti Islam Antarabangsa Malaysia
Description
Summary:Deep neural networks (DNNs) are highly dependent on their parameterization and require experts to decide which method to implement and how to adjust the hyper-parameter values. This study proposes automated hyper-parameter tuning for DNNs using a metaheuristic optimization algorithm, the arithmetic optimization algorithm (AOA). AOA exploits the distribution properties of the primary arithmetic operators in mathematics: multiplication, division, addition, and subtraction. AOA is mathematically modeled and implemented to optimize processes across a broad range of search spaces. Its performance has been evaluated against 29 benchmark functions, and several real-world engineering design problems demonstrate its applicability. The hyper-parameter tuning framework consists of a set of Lorenz chaotic system datasets, a hybrid DNN architecture, and AOA, operating together automatically. As a result, AOA produced the highest accuracy on the test dataset with a combination of optimized hyper-parameters for the DNN architecture. The boxplot analysis also showed that the configuration with ten AOA particles yielded the most accurate choices: with ten particles, AOA had the smallest boxplot spread for all hyper-parameters, indicating the best solution. In particular, the proposed system outperformed the same architecture tuned with particle swarm optimization.
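The abstract's description of AOA — exploration driven by multiplication and division, exploitation by addition and subtraction around the current best solution — can be sketched as follows. This is a minimal illustrative implementation following the published AOA formulation, not the article's own code; the control parameters (alpha, mu, the MOA range) and the sphere test function are illustrative assumptions.

```python
# Minimal sketch of the arithmetic optimization algorithm (AOA).
# Illustrative only: alpha, mu, the MOA range, and the sphere
# objective are assumed values, not taken from the article above.
import random

def aoa_minimize(f, dim, lb, ub, pop=20, iters=200, alpha=5, mu=0.499, seed=0):
    rng = random.Random(seed)
    eps = 1e-12
    # Random initial population in [lb, ub]^dim.
    X = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(pop)]
    best_x = min(X, key=f)[:]
    best_f = f(best_x)
    for t in range(1, iters + 1):
        # Math optimizer accelerated (MOA): grows linearly, so later
        # iterations favor exploitation (addition/subtraction).
        moa = 0.2 + t * (0.9 - 0.2) / iters
        # Math optimizer probability (MOP): shrinks the step size over time.
        mop = 1 - t ** (1 / alpha) / iters ** (1 / alpha)
        for i in range(pop):
            x = []
            for j in range(dim):
                r1, r2, r3 = rng.random(), rng.random(), rng.random()
                scale = (ub - lb) * mu + lb
                if r1 > moa:   # exploration: division / multiplication
                    if r2 > 0.5:
                        v = best_x[j] / (mop + eps) * scale
                    else:
                        v = best_x[j] * mop * scale
                else:          # exploitation: subtraction / addition
                    if r3 > 0.5:
                        v = best_x[j] - mop * scale
                    else:
                        v = best_x[j] + mop * scale
                x.append(min(max(v, lb), ub))  # clamp to the search space
            # Greedy replacement: keep the candidate only if it improves.
            if f(x) < f(X[i]):
                X[i] = x
            if f(X[i]) < best_f:
                best_x, best_f = X[i][:], f(X[i])
    return best_x, best_f

# Usage: minimize the sphere function over [-10, 10]^2.
sphere = lambda x: sum(v * v for v in x)
x_star, f_star = aoa_minimize(sphere, dim=2, lb=-10.0, ub=10.0)
```

In the hyper-parameter tuning setting described by the article, the objective function would instead train the hybrid DNN on the Lorenz chaotic system dataset with the candidate hyper-parameters and return its validation error.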