Variance-covariance based weighing for neural network ensembles
| Field | Value |
|---|---|
| Main Authors | , , |
| Format | Conference or Workshop Item |
| Published | 2013 |
| Online Access | https://www.scopus.com/inward/record.uri?eid=2-s2.0-84893616999&doi=10.1109%2fSMC.2013.548&partnerID=40&md5=aa07a4a882d3d2060e6e4f3b254b3dbb http://eprints.utp.edu.my/32669/ |
| Institution | Universiti Teknologi Petronas |
| Summary | Neural networks (NNs) are a popular artificial intelligence technique for solving complicated problems because of their inherent capabilities. However, generalization in NNs can be harmed by a number of factors, including parameter initialization, inappropriate network topology, and the settings of the training process itself. Forecast combinations of NN models have the potential for improved generalization and lower training time. This paper applies a weighted averaging based on the Variance-Covariance method, which assigns greater weight to forecasts that produce lower error, rather than equal weights. The method is evaluated by combining the forecasts of all candidate models in one experiment and only the best selected models in another. The empirical analysis shows that forecasting accuracy is improved by combining the best individual NN models. Another finding of this study is that reducing the number of NN models increases diversity and, hence, accuracy. © 2013 IEEE. |
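The weighting scheme described in the summary belongs to the variance-covariance (Bates-Granger style) family of forecast combinations: each model's weight is derived from the variance and covariance of its past forecast errors, so models with lower error receive larger weights. The record does not give the paper's exact formula, so the sketch below is only a generic illustration of inverse error-covariance weighting; the function names and toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def variance_covariance_weights(errors):
    """Combination weights from a matrix of past forecast errors
    (rows = observations, columns = candidate NN models).

    Uses the classic inverse error-covariance rule
    w = S^{-1} 1 / (1' S^{-1} 1), so models whose errors have lower
    (co)variance receive larger weights. Generic sketch only; not
    necessarily the exact procedure used in the cited paper.
    """
    sigma = np.cov(errors, rowvar=False)   # error variance-covariance matrix
    ones = np.ones(sigma.shape[0])
    w = np.linalg.pinv(sigma) @ ones       # pseudo-inverse for numerical safety
    return w / (ones @ w)                  # normalize so weights sum to 1

def combine_forecasts(forecasts, weights):
    """Weighted average of the individual model forecasts (columns = models)."""
    return forecasts @ weights

# Toy usage: three candidate NN models, 200 validation errors each;
# model 0 is deliberately the most accurate and should get the largest weight.
rng = np.random.default_rng(42)
errors = rng.normal(scale=[0.5, 1.0, 2.0], size=(200, 3))
weights = variance_covariance_weights(errors)
print(np.round(weights, 3))
```

Note that with the full covariance matrix the weights can fall outside [0, 1] when model errors are strongly correlated; a common simplification, equally hedged here, is to drop the covariances and weight each model by the inverse of its own error variance alone.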