Neural networks (NNs) are a popular artificial intelligence technique for solving complicated problems due to their inherent capabilities. However, generalization in NNs can be harmed by a number of factors, including parameter initialization, inappropriate network topology, and the settings of the training process itself. Forecast combinations of NN models have the potential for improved generalization and lower training time. This paper applies a weighted averaging based on the variance-covariance method, which assigns greater weight to forecasts producing lower error rather than equal weights to all models. The method is implemented in two experiments: one combining the forecasts of all candidate models, and another combining only the best selected models. The empirical analysis shows that forecasting accuracy is improved by combining the best individual NN models. Another finding of this study is that reducing the number of NN models increases diversity and, hence, accuracy.
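The sketch below illustrates the variance-covariance combination idea described in the abstract: weights are chosen to minimise the variance of the combined forecast error, so candidate models with lower error variance receive higher weight than a simple average would give them. This is a minimal illustration assuming validation-period error matrices are available; it is not the authors' exact implementation, and the function and variable names are hypothetical.

```python
# Minimal sketch of variance-covariance forecast combination
# (Bates-Granger style weights), not the paper's exact code.
import numpy as np

def variance_covariance_weights(errors: np.ndarray) -> np.ndarray:
    """Compute combination weights from a (T, k) matrix of forecast
    errors for k candidate models over T validation periods.
    Weights: w = Sigma^{-1} 1 / (1' Sigma^{-1} 1), summing to 1."""
    sigma = np.cov(errors, rowvar=False)   # k x k error covariance matrix
    ones = np.ones(errors.shape[1])
    inv = np.linalg.pinv(sigma)            # pseudo-inverse for numerical stability
    w = inv @ ones
    return w / (ones @ w)

def combine_forecasts(forecasts: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Weighted average of a (T, k) matrix of model forecasts."""
    return forecasts @ weights

# Hypothetical usage with three candidate NN models:
# errors   = actuals[:, None] - validation_forecasts   # shape (T, 3)
# weights  = variance_covariance_weights(errors)
# combined = combine_forecasts(test_forecasts, weights)
```

Models with large or strongly correlated errors receive little weight, which mirrors the paper's observation that combining only the best individual NN models can outperform combining all candidates.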
Event
IEEE Systems, Man and Cybernetics. Conference (2013 : Manchester, England)
Pagination
3214 - 3219
Publisher
IEEE
Location
Manchester, England
Place of publication
Piscataway, N.J.
Start date
2013-10-13
End date
2013-10-16
Language
eng
Publication classification
E1 Full written paper - refereed
Copyright notice
2013, IEEE
Title of proceedings
SMC 2013 : Proceedings of the 2013 IEEE International Conference on Systems, Man and Cybernetics