Deakin University

File(s) under permanent embargo

Artificial neural network: Deep or broad? An empirical study

conference contribution
posted on 2016-01-01, 00:00 authored by Nayyar Zaidi, Nian Liu
The advent of Deep Learning and the emergence of Big Data have led to renewed interest in the study of Artificial Neural Networks (ANNs). An ANN is a highly effective classifier that is capable of learning both linear and non-linear decision boundaries. Choosing the number of hidden layers and the number of nodes in each hidden layer (along with many other parameters) of an ANN is a model selection problem. With the success of deep learning, especially on big datasets, there is a prevalent belief in the machine learning community that a deep model (that is, a model with many hidden layers) is preferable. However, this sits at odds with earlier theorems proved for ANNs showing that a single hidden layer (with multiple nodes), i.e., a shallow, broad ANN, is capable of approximating any arbitrary function. This raises the question of whether one should build a deep network or a broad one. In this paper, we present a systematic study of the depth and breadth of an ANN in terms of its accuracy (0-1 loss), bias, variance and convergence performance on 72 standard UCI datasets, and we argue that a broad ANN has better overall performance than a deep ANN.
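The depth-versus-breadth question framed in the abstract is, at heart, a model-selection trade-off: a fixed parameter budget can be spent on stacking layers or on widening a single layer. A minimal sketch of that accounting follows; the layer sizes below are arbitrary illustrative choices, not configurations taken from the paper.

```python
def mlp_param_count(n_in, hidden, n_out):
    """Count learnable parameters (weights + biases) in a
    fully connected feed-forward network with the given layer sizes."""
    sizes = [n_in] + list(hidden) + [n_out]
    return sum(sizes[i] * sizes[i + 1] + sizes[i + 1]
               for i in range(len(sizes) - 1))

# A broad network: one wide hidden layer of 100 nodes.
broad = mlp_param_count(10, [100], 2)   # 10*100+100 + 100*2+2 = 1302

# A deep network: three narrow hidden layers of 20 nodes each.
deep = mlp_param_count(10, [20, 20, 20], 2)  # 220 + 420 + 420 + 42 = 1102

print(broad, deep)  # comparable budgets, very different shapes
```

With roughly matched parameter counts, the two architectures can still behave very differently in bias, variance and convergence, which is the comparison the paper carries out empirically across the 72 UCI datasets.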

History

Event

Australasian Joint Conference on Artificial Intelligence (2016 : 29th : Hobart, Tasmania)

Volume

9992

Series

Lecture Notes in Computer Science

Pagination

535-541

Publisher

Springer

Location

Hobart, Tasmania

Place of publication

Berlin, Germany

Start date

2016-12-05

End date

2016-12-08

ISBN-13

9783319501277

Language

eng

Publication classification

E1.1 Full written paper - refereed

Editor/Contributor(s)

Byeong Ho Kang, Quan Bai

Title of proceedings

AI 2016: Advances in Artificial Intelligence, 29th Australasian Joint Conference, Hobart, TAS, Australia, December 5-8, 2016, Proceedings
