Selecting the topology of a neural network and suitable parameters for the learning algorithm is a tedious task when designing an optimal artificial neural network that is smaller, faster and generalizes better. In this paper we introduce the recently developed cutting angle method, a deterministic technique, for global optimization of connection weights. Neural networks are first trained using the cutting angle method, and the learning is then fine-tuned (meta-learning) using conventional gradient descent or other optimization techniques. Experiments were carried out on three time-series benchmarks, with a comparison against evolutionary neural networks. Our preliminary experimental results show that the proposed deterministic approach can provide near-optimal results much faster than the evolutionary approach.
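For illustration only, the sketch below (not from the paper) shows the two-phase training idea in Python: a small feed-forward network for one-step time-series prediction has its weights placed by a bounded global search and is then fine-tuned with plain gradient descent. SciPy's differential_evolution is used purely as a stand-in for the cutting angle method, which is not implemented here; the network size, toy sine-wave data and learning rate are all assumptions made for the example.

import numpy as np
from scipy.optimize import differential_evolution

# Toy time-series task: predict x[t] from the previous 3 values of a sine wave.
series = np.sin(np.linspace(0, 8 * np.pi, 400))
X = np.stack([series[i:i + 3] for i in range(len(series) - 3)])
y = series[3:]

N_HIDDEN = 5
N_W = 3 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1   # weights + biases of a 3-5-1 net

def unpack(w):
    # Split the flat weight vector into layer matrices and bias vectors.
    i = 0
    W1 = w[i:i + 3 * N_HIDDEN].reshape(3, N_HIDDEN); i += 3 * N_HIDDEN
    b1 = w[i:i + N_HIDDEN]; i += N_HIDDEN
    W2 = w[i:i + N_HIDDEN].reshape(N_HIDDEN, 1); i += N_HIDDEN
    b2 = w[i:i + 1]
    return W1, b1, W2, b2

def mse(w):
    # Mean squared prediction error of the 3-5-1 network for weight vector w.
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)          # hidden layer
    pred = (h @ W2 + b2).ravel()      # linear output
    return np.mean((pred - y) ** 2)

# Phase 1: deterministic-style global search over a bounded weight box
# (stand-in for the cutting angle method described in the paper).
result = differential_evolution(mse, bounds=[(-3, 3)] * N_W, maxiter=50, seed=0)
w = result.x.copy()

# Phase 2: fine-tune the globally found weights with gradient descent;
# gradients are estimated numerically to keep the sketch short.
def num_grad(f, w, eps=1e-5):
    g = np.zeros_like(w)
    for i in range(len(w)):
        d = np.zeros_like(w); d[i] = eps
        g[i] = (f(w + d) - f(w - d)) / (2 * eps)
    return g

lr = 0.05
for _ in range(200):
    w -= lr * num_grad(mse, w)

print(f"MSE after global phase: {mse(result.x):.5f}, after fine-tuning: {mse(w):.5f}")

The design point the paper makes is the ordering: a global, deterministic pass places the weights near a good basin, and a cheap local optimizer then does the final refinement, which in the paper's experiments was faster than an evolutionary search.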
Title of proceedings: Hybrid information systems
Event: International Workshop on Hybrid Intelligent Systems (2001 : Adelaide, S. Aust.)