Preconditioning an artificial neural network using Naive Bayes
Conference contribution posted on 2016-01-01, 00:00, authored by Nayyar Zaidi, Francois Petitjean, Geoff Webb
Logistic Regression (LR) is a workhorse of the statistics community and a state-of-the-art machine learning classifier. It learns a linear model from inputs to outputs by optimizing the Conditional Log-Likelihood (CLL) of the data. Recently, it has been shown that preconditioning LR with a Naive Bayes (NB) model speeds up LR learning many-fold. One can, however, train a linear model by optimizing the mean-square-error (MSE) instead of the CLL; this yields an Artificial Neural Network (ANN) with no hidden layer. In this work, we study the effect of NB preconditioning on such an ANN classifier. Optimizing MSE instead of CLL may lead to a lower-bias classifier and hence to better performance on big datasets. We show that NB preconditioning can speed up convergence significantly, and that optimizing a linear model with MSE leads to a lower-bias classifier than optimizing with CLL. We also compare performance with the state-of-the-art Random Forest classifier.
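To make the setup described in the abstract concrete, the Python sketch below shows a linear model with no hidden layer whose inputs are the per-class Naive Bayes log-probabilities and whose weights are fitted by minimizing the mean-square-error. This is an illustrative sketch only, not the authors' implementation: it assumes discretized features, Laplace smoothing, softmax outputs, and plain batch gradient descent, and the function names (nb_log_probs, nb_features, train_mse) are made up here for exposition.

import numpy as np

def nb_log_probs(X, y, n_classes, n_vals, alpha=1.0):
    # Naive Bayes log P(y) and log P(x_j = v | y) with Laplace smoothing.
    n, d = X.shape
    log_prior = np.log((np.bincount(y, minlength=n_classes) + alpha)
                       / (n + alpha * n_classes))
    log_cond = np.zeros((n_classes, d, n_vals))
    for c in range(n_classes):
        Xc = X[y == c]
        for j in range(d):
            counts = np.bincount(Xc[:, j], minlength=n_vals) + alpha
            log_cond[c, j] = np.log(counts / counts.sum())
    return log_prior, log_cond

def nb_features(X, log_prior, log_cond):
    # Per-class NB log-probability "features"; phi has shape (n, C, d + 1).
    n, d = X.shape
    C = log_prior.shape[0]
    phi = np.zeros((n, C, d + 1))
    phi[:, :, 0] = log_prior                          # prior term
    for j in range(d):
        phi[:, :, j + 1] = log_cond[:, j, X[:, j]].T  # conditional terms
    return phi

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_mse(X, y, n_classes, n_vals, epochs=200, lr=0.1):
    # Linear model (no hidden layer) over NB log-probability features,
    # trained by batch gradient descent on the mean-square-error.
    log_prior, log_cond = nb_log_probs(X, y, n_classes, n_vals)
    phi = nb_features(X, log_prior, log_cond)   # (n, C, d + 1)
    W = np.ones((n_classes, X.shape[1] + 1))    # all-ones weights recover plain NB
    T = np.eye(n_classes)[y]                    # one-hot targets
    for _ in range(epochs):
        P = softmax((phi * W).sum(axis=2))      # class probabilities
        # Gradient of 0.5 * ||P - T||^2 through the softmax.
        G = (P - T) * P
        G = G - P * G.sum(axis=1, keepdims=True)
        W -= lr * (G[:, :, None] * phi).mean(axis=0)
    return W, log_prior, log_cond

# Toy usage: 2 classes, 3 binary features.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 3))
y = (X.sum(axis=1) > 1).astype(int)
W, lp, lc = train_mse(X, y, n_classes=2, n_vals=2)

Initializing all weights to one means the first iterate already reproduces the NB posterior, which is the intuition behind the faster convergence reported in the abstract.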
Event: Pacific-Asia Conference on Knowledge Discovery and Data Mining (20th : 2016 : Auckland, New Zealand)
Series: Lecture Notes in Computer Science
Pagination: 341-353
Location: Auckland, New Zealand
Place of publication: Berlin, Germany
Publication classification: E1.1 Full written paper - refereed
Title of proceedings: PAKDD 2016 : Advances in knowledge discovery and data mining : 20th Pacific-Asia Conference, PAKDD 2016, Auckland, New Zealand, April 19-22, 2016, Proceedings.
Keywords: Logistic regression; Preconditioning; Conditional log-likelihood; Mean-square-error; WANBIA-C; Artificial neural networks; Science & Technology; Technology; Computer Science, Artificial Intelligence; Computer Science, Information Systems; Computer Science, Theory & Methods; Computer Science; ALGORITHM; Artificial Intelligence and Image Processing