Deakin University

File(s) under permanent embargo

Preconditioning an artificial neural network using naive Bayes

Conference contribution
posted on 2016-01-01, 00:00 authored by Nayyar Zaidi, Francois Petitjean, Geoff Webb
Logistic Regression (LR) is a workhorse of the statistics community and a state-of-the-art machine learning classifier. It learns a linear model from inputs to outputs, trained by optimizing the Conditional Log-Likelihood (CLL) of the data. Recently, it has been shown that preconditioning LR using a Naive Bayes (NB) model speeds up LR learning many-fold. One can, however, train a linear model by optimizing the mean-square-error (MSE) instead of CLL. This leads to an Artificial Neural Network (ANN) with no hidden layer. In this work, we study the effect of NB preconditioning on such an ANN classifier. Optimizing MSE instead of CLL may lead to a lower-bias classifier and hence result in better performance on big datasets. We show that this NB preconditioning can speed up convergence significantly, and that optimizing a linear model with MSE leads to a lower-bias classifier than optimizing with CLL. Finally, we compare the performance to the state-of-the-art Random Forest classifier.
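The idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes binary features and simplifies NB preconditioning to a warm start, using the Naive Bayes log-odds as the initial weights of a no-hidden-layer network that is then trained by gradient descent on MSE of a sigmoid output. The dataset, learning rate, and epoch count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-feature dataset: the two classes differ in per-feature
# probabilities of a feature being 1 (illustrative, not from the paper).
n, d = 400, 5
y = rng.integers(0, 2, size=n)
p1, p0 = np.linspace(0.2, 0.8, d), np.linspace(0.8, 0.2, d)
X = (rng.random((n, d)) < np.where(y[:, None] == 1, p1, p0)).astype(float)

def nb_init(X, y, alpha=1.0):
    """Naive Bayes log-odds as a linear model (binary features,
    Laplace smoothing). Returns weights and bias such that
    X @ w + b equals log P(y=1|x) - log P(y=0|x) under NB."""
    X1, X0 = X[y == 1], X[y == 0]
    t1 = (X1.sum(0) + alpha) / (len(X1) + 2 * alpha)  # P(x_j=1 | y=1)
    t0 = (X0.sum(0) + alpha) / (len(X0) + 2 * alpha)  # P(x_j=1 | y=0)
    w = np.log(t1 / t0) - np.log((1 - t1) / (1 - t0))
    b = np.log(len(X1) / len(X0)) + np.sum(np.log((1 - t1) / (1 - t0)))
    return w, b

def train_mse(X, y, w, b, lr=0.5, epochs=200):
    """Gradient descent on mean-squared error of the sigmoid output:
    a linear model / no-hidden-layer ANN, as described in the abstract."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = (p - y) * p * (1 - p)       # dMSE/dz per example (factor 2 absorbed in lr)
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

w0, b0 = nb_init(X, y)                  # NB "preconditioning" as a warm start
w, b = train_mse(X, y, w0.copy(), b0)
acc = (((X @ w + b) > 0).astype(int) == y).mean()
```

Starting from the NB solution rather than zeros means the MSE optimizer begins near a sensible linear classifier, which is the intuition behind the reported speed-up in convergence; the paper's actual preconditioning scheme is more involved than this warm start.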

History

Event

Pacific-Asia Conference on Knowledge Discovery and Data Mining (20th : 2016 : Auckland, New Zealand)

Volume

9651

Series

Lecture Notes in Computer Science

Pagination

341-353

Publisher

Springer

Location

Auckland, New Zealand

Place of publication

Berlin, Germany

Start date

2016-04-19

End date

2016-04-22

ISBN-13

9783319317533

ISBN-10

3319317539

Language

eng

Publication classification

E1.1 Full written paper - refereed

Editor/Contributor(s)

James Bailey

Title of proceedings

PAKDD 2016 : Advances in knowledge discovery and data mining : 20th Pacific-Asia Conference, PAKDD 2016, Auckland, New Zealand, April 19-22, 2016, Proceedings.