Enhancing extreme learning machines using cross-entropy moth-flame optimization algorithm
Main Authors: | , , |
Format: | Article |
Published: | Centre for Environment and Socio-Economic Research Publications, 2022 |
Subjects: | |
Online Access: | http://eprints.utm.my/id/eprint/98680/ http://www.ceser.in/ceserp/index.php/ijai/article/view/6857 |
Institution: | Universiti Teknologi Malaysia |
Summary: | Extreme Learning Machines (ELM) learn fast and eliminate the tuning of input weights and biases. However, because these input parameters are initialized randomly, ELM does not guarantee an optimal setting of the weights and biases; as a result, it suffers from unstable output, large network size, and degraded generalization performance. To overcome these problems, an efficient co-evolutionary hybrid model, the Cross-Entropy Moth-Flame Optimization ELM (CEMFO-ELM), is proposed to train a neural network by selecting optimal input weights and biases. The hybrid model balances exploration and exploitation of the search space and then selects optimal input weights and biases for ELM, while the co-evolutionary algorithm reduces the chance of becoming trapped in a local extremum. Accuracy, stability, and percentage improvement ratio (PIR%) were the metrics used to evaluate the proposed model on classification datasets from the University of California, Irvine machine learning repository. The co-evolutionary scheme was compared with its constituent ELM-based meta-heuristic schemes (CE-ELM and MFO-ELM). The co-evolutionary meta-heuristic algorithm enhances the selection of optimal parameters for ELM: it improved the accuracy of ELM in every simulation, improved the stability of ELM in every simulation (by up to 53% in the breast cancer simulation), and converged better than the comparative ELM hybrid models in all simulations. |
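The summary above describes the core mechanism: a meta-heuristic searches over the ELM's randomly initialized input weights and biases, while the output weights are still computed analytically from the hidden-layer activations. Below is a minimal sketch of that idea in Python, assuming a plain cross-entropy-method loop in place of the paper's full CEMFO hybrid and a toy synthetic dataset; the function names (`elm_fit_predict`, `cross_entropy_optimize`) and all parameter values are illustrative, not taken from the paper.

```python
# Minimal sketch of a metaheuristic-tuned ELM (illustration only, not the
# authors' exact CEMFO-ELM implementation). The hidden-layer input weights
# and biases form one flat candidate vector; the output weights are then
# solved analytically with the Moore-Penrose pseudo-inverse, as in standard ELM.
import numpy as np

rng = np.random.default_rng(0)

def elm_fit_predict(params, X_train, y_train, X_test, n_hidden):
    """Decode a candidate vector into (W, b), fit the ELM output layer on the
    training data, and return raw output scores for X_test."""
    n_features = X_train.shape[1]
    W = params[: n_features * n_hidden].reshape(n_features, n_hidden)
    b = params[n_features * n_hidden:]
    H_train = np.tanh(X_train @ W + b)          # hidden-layer activations
    beta = np.linalg.pinv(H_train) @ y_train    # analytic output weights
    return np.tanh(X_test @ W + b) @ beta

def fitness(params, X, y, n_hidden):
    """Negative training accuracy of the ELM defined by `params`
    (lower is better for the minimizing optimizer below)."""
    scores = elm_fit_predict(params, X, y, X, n_hidden)
    return -np.mean((scores > 0.5) == (y > 0.5))

def cross_entropy_optimize(obj, dim, n_iter=30, pop=50, elite=10):
    """Small cross-entropy-method loop standing in for the CE/MFO hybrid:
    sample candidates from a Gaussian, keep the elites, and refit the
    sampling distribution to them."""
    mu, sigma = np.zeros(dim), np.ones(dim)
    best, best_f = None, np.inf
    for _ in range(n_iter):
        cands = rng.normal(mu, sigma, size=(pop, dim))
        f = np.array([obj(c) for c in cands])
        idx = np.argsort(f)[:elite]
        mu, sigma = cands[idx].mean(axis=0), cands[idx].std(axis=0) + 1e-6
        if f[idx[0]] < best_f:
            best, best_f = cands[idx[0]], f[idx[0]]
    return best, best_f

# Toy binary-classification data (two Gaussian blobs) just to run the sketch.
X = np.vstack([rng.normal(-1, 1, (100, 4)), rng.normal(1, 1, (100, 4))])
y = np.concatenate([np.zeros(100), np.ones(100)])

n_hidden = 20
dim = X.shape[1] * n_hidden + n_hidden          # input weights + biases
best_params, best_f = cross_entropy_optimize(
    lambda p: fitness(p, X, y, n_hidden), dim)
print("training accuracy with optimized input layer:", -best_f)
```

The point the paper exploits is visible in the sketch: only the input-layer parameters are searched by the meta-heuristic, so each fitness evaluation stays cheap because the output weights come from a single pseudo-inverse rather than iterative training.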