The functional link neural network (FLNN) increases the input dimension by functionally expanding the input features. In this paper, modifications to the FLNN are proposed for undertaking data classification tasks. The main objective is to optimize the FLNN by formulating a parsimonious network with less complexity and a lower computational burden than the original FLNN. The methodology consists of selecting a number of important expanded features to build the FLNN structure. It is based on the rationale that not all expanded features are equally important in distinguishing different target classes. As such, we modify the FLNN so that less relevant and redundant expanded input features are identified and discarded. In addition, instead of using the back-propagation learning algorithm, adjustment of the network weights is formulated as an optimization task. Specifically, the genetic algorithm is used for both feature selection and weight tuning in the FLNN. An experimental study using benchmark problems is conducted to evaluate the efficacy of the modified FLNN. The empirical results indicate that even though the structure of the modified FLNN is simpler, it achieves classification results comparable to those of the original FLNN with fully expanded input features.
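To illustrate the idea described in the abstract, the sketch below implements a flat (single-layer) FLNN with a trigonometric functional expansion, plus a simple genetic-algorithm loop that evolves a binary feature mask together with the weights. This is a minimal assumed reconstruction, not the authors' implementation: the expansion order, the complexity penalty of 0.01 per retained feature, and the truncation-selection-with-Gaussian-mutation GA are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def functional_expansion(X, order=2):
    """Trigonometric expansion: each feature x becomes
    [x, sin(pi*x), cos(pi*x), ..., sin(order*pi*x), cos(order*pi*x)]."""
    parts = [X]
    for k in range(1, order + 1):
        parts.append(np.sin(k * np.pi * X))
        parts.append(np.cos(k * np.pi * X))
    return np.hstack(parts)

def predict(X_exp, mask, w, b):
    """Flat FLNN: masked expanded features feed a single sigmoid unit."""
    return 1.0 / (1.0 + np.exp(-((X_exp * mask) @ w + b)))

def fitness(chrom, X_exp, y, d):
    """Accuracy minus a small penalty per retained expanded feature,
    so the GA prefers parsimonious networks (illustrative weighting)."""
    mask, w, b = chrom[:d] > 0.5, chrom[d:2 * d], chrom[2 * d]
    acc = np.mean((predict(X_exp, mask, w, b) > 0.5) == y)
    return acc - 0.01 * mask.mean()

def ga_train(X_exp, y, pop_size=30, gens=40, mut=0.1):
    """Evolve chromosomes = [mask genes | weights | bias] jointly."""
    d = X_exp.shape[1]
    pop = rng.uniform(-1, 1, (pop_size, 2 * d + 1))
    pop[:, :d] = rng.random((pop_size, d))              # mask genes in [0, 1]
    for _ in range(gens):
        fit = np.array([fitness(c, X_exp, y, d) for c in pop])
        elite = pop[np.argsort(fit)[::-1][:pop_size // 2]]   # truncation selection
        kids = elite + rng.normal(0, mut, elite.shape)       # Gaussian mutation
        pop = np.vstack([elite, kids])                       # elitism keeps the best
    fit = np.array([fitness(c, X_exp, y, d) for c in pop])
    return pop[np.argmax(fit)], fit.max()
```

Because the mask genes are evolved alongside the weights, the GA can simultaneously discard low-value expanded features and tune the remaining connections, mirroring the paper's combined feature-selection and weight-tuning objective.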
Chapter number
13
Pagination
229-244
ISSN
2196-8861
ISBN-13
978-981-10-3955-3
Language
English
Publication classification
B Book chapter, B1 Book chapter
Copyright notice
2017, Springer Nature Singapore Pte Ltd.
Extent
15
Editor/Contributor(s)
Bhatti A, Lee KH, Garmestani H, Lim CP
Publisher
Springer Nature
Place of publication
Singapore
Title of book
Emerging trends in neuro engineering and neural computation