
Derivative-free optimization and neural networks for robust regression

Journal contribution posted on 2012-01-01, authored by Gleb Beliakov, Andrei Kelarev, John Yearwood
Large outliers break down linear and nonlinear regression models. Robust regression methods allow one to filter out the outliers when building a model. By replacing the traditional least squares criterion with the least trimmed squares (LTS) criterion, in which half of the data is treated as potential outliers, one can fit accurate regression models to strongly contaminated data. High-breakdown methods are very well established in linear regression, but have only recently been applied to nonlinear regression. In this work, we examine the problem of fitting artificial neural networks (ANNs) to contaminated data using the LTS criterion. We introduce a penalized LTS criterion that prevents unnecessary removal of valid data. Training of ANNs leads to a challenging non-smooth global optimization problem. We compare the efficiency of several derivative-free optimization methods in solving it, and show that our approach identifies the outliers correctly when ANNs are used for nonlinear regression.
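To make the idea concrete, here is a minimal Python sketch of an LTS-style criterion for a small one-hidden-layer network, minimized with a derivative-free global optimizer. It is an illustration only: the network size, the penalty form (trimmed points contributing down-weighted absolute residuals, weighted by an assumed parameter `alpha`), and the choice of differential evolution as the solver are assumptions, not the paper's exact penalized LTS criterion or solver line-up.

```python
# Sketch: penalized LTS fitting of a tiny neural network on contaminated
# data, using a derivative-free global optimizer (differential evolution
# as a stand-in for the methods compared in the paper).
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

# Contaminated 1-D regression data: a sine curve plus gross outliers.
X = np.linspace(-3, 3, 60)
y = np.sin(X) + 0.1 * rng.standard_normal(60)
y[::10] += 8.0  # inject large outliers

H = 3  # hidden units; theta packs [W1 (H), b1 (H), W2 (H), b2 (1)]

def predict(theta, x):
    W1, b1 = theta[:H], theta[H:2*H]
    W2, b2 = theta[2*H:3*H], theta[3*H]
    return np.tanh(np.outer(x, W1) + b1) @ W2 + b2

def penalized_lts(theta, alpha=0.05):
    # Classical LTS: sort squared residuals and sum only the h smallest
    # (h = half the sample), so the rest may be treated as outliers.
    r2 = np.sort((y - predict(theta, X)) ** 2)
    h = len(r2) // 2
    # Illustrative penalty (an assumption, not the paper's exact form):
    # trimmed points still contribute their absolute residuals at a small
    # weight, discouraging the removal of points that actually fit well.
    return r2[:h].sum() + alpha * np.sqrt(r2[h:]).sum()

# Sorting residuals makes the objective non-smooth and multi-extremal,
# hence a derivative-free global method rather than gradient descent.
bounds = [(-5.0, 5.0)] * (3 * H + 1)
result = differential_evolution(penalized_lts, bounds, seed=0, maxiter=300)
print("penalized LTS value:", result.fun)
```

Because the criterion sums only the smallest residuals, the injected outliers end up in the trimmed half and do not pull the fit away from the sine curve, which is the high-breakdown behaviour the abstract describes.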

History

Journal

Optimization

Volume

61

Issue

12

Pagination

1467-1490

Publisher

Taylor & Francis

Location

Abingdon, England

ISSN

0233-1934

eISSN

1029-4945

Language

eng

Publication classification

C1 Refereed article in a scholarly journal

Copyright notice

2012, Taylor & Francis