Enhancing extreme learning machines using cross-entropy moth-flame optimization algorithm

Extreme Learning Machines (ELM) learn fast and eliminate the tuning of input weights and biases. However, because the input parameters are initialized randomly, ELM does not guarantee an optimal setting of the weights and biases. As a result, ELM suffers from unstable output, large network size, and degraded generalization performance. To overcome these problems, an efficient co-evolutionary hybrid model, the Cross-Entropy Moth-Flame Optimization (CEMFO-ELM) model, is proposed to train a neural network by selecting optimal input weights and biases. The hybrid model balances exploration and exploitation of the search space and then selects optimal input weights and biases for ELM. The co-evolutionary algorithm reduces the chance of being trapped in a local extremum in the search space. Accuracy, stability, and percentage improvement ratio (PIR%) were the metrics used to evaluate the performance of the proposed model on classification datasets from the University of California, Irvine machine learning repository. The co-evolutionary scheme was compared with its constituent ELM-based enhanced meta-heuristic schemes (CE-ELM and MFO-ELM). The co-evolutionary meta-heuristic algorithm enhances the selection of optimal parameters for ELM. It improves the accuracy of ELM in all the simulations and improves the stability of ELM in all of them, by up to 53% in the breast cancer simulation. It also converges better than the comparative ELM hybrid models in all the simulations.
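The abstract only describes the approach at a high level. As a rough illustration of the underlying idea (an ELM whose input weights and biases are not left random but are scored and selected by an external optimizer), the following minimal NumPy sketch shows an ELM trained analytically from supplied input weights, plus the kind of validation-accuracy fitness function a metaheuristic such as cross-entropy or moth-flame optimization could maximize. All function and variable names are hypothetical, and a plain random search stands in for the paper's CEMFO algorithm; this is not the authors' implementation.

    # Illustrative sketch only: ELM with externally supplied input weights/biases.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def elm_train(X, y_onehot, W, b):
        """Hidden layer H = sigmoid(XW + b); output weights beta = pinv(H) @ Y."""
        H = sigmoid(X @ W + b)
        return np.linalg.pinv(H) @ y_onehot

    def elm_predict(X, W, b, beta):
        return np.argmax(sigmoid(X @ W + b) @ beta, axis=1)

    def fitness(candidate, X_tr, y_tr, X_val, y_val, n_feat, n_hidden, n_classes):
        """Validation accuracy of the ELM built from one candidate weight/bias vector."""
        W = candidate[: n_feat * n_hidden].reshape(n_feat, n_hidden)
        b = candidate[n_feat * n_hidden :].reshape(1, n_hidden)
        Y = np.eye(n_classes)[y_tr]            # one-hot training targets
        beta = elm_train(X_tr, Y, W, b)
        return np.mean(elm_predict(X_val, W, b, beta) == y_val)

    # Toy data and a random search standing in for the metaheuristic optimizer.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8)); y = (X[:, 0] + X[:, 1] > 0).astype(int)
    X_tr, X_val, y_tr, y_val = X[:150], X[150:], y[:150], y[150:]
    n_hidden, n_classes, dim = 20, 2, 8 * 20 + 20
    best = max((rng.uniform(-1, 1, dim) for _ in range(50)),
               key=lambda c: fitness(c, X_tr, y_tr, X_val, y_val, 8, n_hidden, n_classes))
    print("best validation accuracy:",
          fitness(best, X_tr, y_tr, X_val, y_val, 8, n_hidden, n_classes))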


Bibliographic Details
Main Authors: Alade, Oyekale Abel, Sallehuddin, Roselina, Mohamed Radzi, Nor Haizan
Format: Article
Published: Centre for Environment and Socio-Economic Research Publications 2022
Subjects: QA75 Electronic computers. Computer science
Online Access:http://eprints.utm.my/id/eprint/98680/
http://www.ceser.in/ceserp/index.php/ijai/article/view/6857
Institution: Universiti Teknologi Malaysia
id my.utm.98680
record_format eprints
spelling my.utm.98680 2023-01-30T04:54:46Z http://eprints.utm.my/id/eprint/98680/ Enhancing extreme learning machines using cross-entropy moth-flame optimization algorithm Alade, Oyekale Abel Sallehuddin, Roselina Mohamed Radzi, Nor Haizan QA75 Electronic computers. Computer science Extreme Learning Machines (ELM) learn fast and eliminate the tuning of input weights and biases. However, because the input parameters are initialized randomly, ELM does not guarantee an optimal setting of the weights and biases. As a result, ELM suffers from unstable output, large network size, and degraded generalization performance. To overcome these problems, an efficient co-evolutionary hybrid model, the Cross-Entropy Moth-Flame Optimization (CEMFO-ELM) model, is proposed to train a neural network by selecting optimal input weights and biases. The hybrid model balances exploration and exploitation of the search space and then selects optimal input weights and biases for ELM. The co-evolutionary algorithm reduces the chance of being trapped in a local extremum in the search space. Accuracy, stability, and percentage improvement ratio (PIR%) were the metrics used to evaluate the performance of the proposed model on classification datasets from the University of California, Irvine machine learning repository. The co-evolutionary scheme was compared with its constituent ELM-based enhanced meta-heuristic schemes (CE-ELM and MFO-ELM). The co-evolutionary meta-heuristic algorithm enhances the selection of optimal parameters for ELM. It improves the accuracy of ELM in all the simulations and improves the stability of ELM in all of them, by up to 53% in the breast cancer simulation. It also converges better than the comparative ELM hybrid models in all the simulations. Centre for Environment and Socio-Economic Research Publications 2022 Article PeerReviewed Alade, Oyekale Abel and Sallehuddin, Roselina and Mohamed Radzi, Nor Haizan (2022) Enhancing extreme learning machines using cross-entropy moth-flame optimization algorithm. International Journal of Artificial Intelligence, 20 (1). pp. 46-67. ISSN 0974-0635 http://www.ceser.in/ceserp/index.php/ijai/article/view/6857
institution Universiti Teknologi Malaysia
building UTM Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Teknologi Malaysia
content_source UTM Institutional Repository
url_provider http://eprints.utm.my/
topic QA75 Electronic computers. Computer science
spellingShingle QA75 Electronic computers. Computer science
Alade, Oyekale Abel
Sallehuddin, Roselina
Mohamed Radzi, Nor Haizan
Enhancing extreme learning machines using cross-entropy moth-flame optimization algorithm
description Extreme Learning Machines (ELM) learn fast and eliminate the tuning of input weights and biases. However, because the input parameters are initialized randomly, ELM does not guarantee an optimal setting of the weights and biases. As a result, ELM suffers from unstable output, large network size, and degraded generalization performance. To overcome these problems, an efficient co-evolutionary hybrid model, the Cross-Entropy Moth-Flame Optimization (CEMFO-ELM) model, is proposed to train a neural network by selecting optimal input weights and biases. The hybrid model balances exploration and exploitation of the search space and then selects optimal input weights and biases for ELM. The co-evolutionary algorithm reduces the chance of being trapped in a local extremum in the search space. Accuracy, stability, and percentage improvement ratio (PIR%) were the metrics used to evaluate the performance of the proposed model on classification datasets from the University of California, Irvine machine learning repository. The co-evolutionary scheme was compared with its constituent ELM-based enhanced meta-heuristic schemes (CE-ELM and MFO-ELM). The co-evolutionary meta-heuristic algorithm enhances the selection of optimal parameters for ELM. It improves the accuracy of ELM in all the simulations and improves the stability of ELM in all of them, by up to 53% in the breast cancer simulation. It also converges better than the comparative ELM hybrid models in all the simulations.
format Article
author Alade, Oyekale Abel
Sallehuddin, Roselina
Mohamed Radzi, Nor Haizan
author_facet Alade, Oyekale Abel
Sallehuddin, Roselina
Mohamed Radzi, Nor Haizan
author_sort Alade, Oyekale Abel
title Enhancing extreme learning machines using cross-entropy moth-flame optimization algorithm
title_short Enhancing extreme learning machines using cross-entropy moth-flame optimization algorithm
title_full Enhancing extreme learning machines using cross-entropy moth-flame optimization algorithm
title_fullStr Enhancing extreme learning machines using cross-entropy moth-flame optimization algorithm
title_full_unstemmed Enhancing extreme learning machines using cross-entropy moth-flame optimization algorithm
title_sort enhancing extreme learning machines using cross-entropy moth-flame optimization algorithm
publisher Centre for Environment and Socio-Economic Research Publications
publishDate 2022
url http://eprints.utm.my/id/eprint/98680/
http://www.ceser.in/ceserp/index.php/ijai/article/view/6857
_version_ 1756684242890260480