File(s) under permanent embargo
Derivative-free optimization and neural networks for robust regression
journal contribution
posted on 2012-01-01, 00:00, authored by Gleb Beliakov, Andrei Kelarev, John Yearwood

Large outliers break down linear and nonlinear regression models. Robust regression methods allow one to filter out the outliers when building a model. By replacing the traditional least squares criterion with the least trimmed squares (LTS) criterion, in which half of the data is treated as potential outliers, one can fit accurate regression models to strongly contaminated data. High-breakdown methods are very well established in linear regression, but have only recently begun to be applied to non-linear regression. In this work, we examine the problem of fitting artificial neural networks (ANNs) to contaminated data using the LTS criterion. We introduce a penalized LTS criterion which prevents unnecessary removal of valid data. Training of ANNs leads to a challenging non-smooth global optimization problem. We compare the efficiency of several derivative-free optimization methods in solving it, and show that our approach identifies the outliers correctly when ANNs are used for nonlinear regression.
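To illustrate the criterion the abstract describes, here is a minimal sketch of a least trimmed squares loss: square the residuals, sort them, and sum only the h smallest, so the remaining points are treated as potential outliers. The function name and the default choice of h are illustrative assumptions; the paper's penalized variant, which adds a term discouraging unnecessary trimming, is not reproduced here.

```python
import numpy as np

def lts_loss(residuals, h=None):
    """Least trimmed squares (LTS) criterion: sum of the h smallest
    squared residuals; the n - h largest are trimmed as potential outliers."""
    r2 = np.sort(np.asarray(residuals, dtype=float) ** 2)
    if h is None:
        # Trim roughly half of the data, as in the abstract's description.
        h = (len(r2) + 1) // 2
    return r2[:h].sum()

# Example: the gross outlier residual (100) is trimmed and does not
# dominate the loss, unlike ordinary least squares.
print(lts_loss([1.0, -2.0, 100.0], h=2))
```

In ANN training this loss would be evaluated on the residuals of the network's predictions; because the sort makes it non-smooth in the network weights, minimizing it calls for the derivative-free global optimization methods the paper compares.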
History
Journal: Optimization
Volume: 61
Issue: 12
Pagination: 1467 - 1490
Publisher: Taylor & Francis
Location: Abingdon, England
Publisher DOI:
ISSN: 0233-1934
eISSN: 1029-4945
Language: eng
Publication classification: C1 Refereed article in a scholarly journal
Copyright notice: 2012, Taylor & Francis
Keywords: global optimization; least-trimmed squares; neural networks; non-smooth optimization; robust regression; Science & Technology; Technology; Physical Sciences; Operations Research & Management Science; Mathematics, Applied; Mathematics; FEEDFORWARD NETWORKS; OUTLIER DETECTION; ALGORITHM; EFFICIENT; BREAKDOWN; APPROXIMATION; ESTIMATORS; SYSTEMS