Deakin University

Nonsmooth optimization algorithm for solving clusterwise linear regression problems

journal contribution
posted on 2024-06-04, 13:50 authored by A. M. Bagirov, Julien Ugon, H. G. Mirzayeva
Clusterwise linear regression consists of finding a number of linear regression functions, each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem, and an algorithm based on an incremental approach and the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole dataset into groups, each of which can be approximated well by one linear regression function. A special procedure is introduced to generate good starting points for the global optimization problems solved at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Späth algorithm and the incremental algorithm on several publicly available regression datasets.
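
To make the underlying model concrete, the sketch below is a minimal, illustrative implementation of clusterwise linear regression using the simpler Späth-style alternating heuristic mentioned above as a comparator: assign each point to its best-fitting regression function, then refit each function by least squares. It is not the authors' incremental, discrete-gradient-based algorithm; the function name, parameters, and toy data are assumptions made for illustration only.

import numpy as np

def clusterwise_linreg(X, y, k, n_iter=50, seed=0):
    """Illustrative Spaeth-style fit of k linear regression functions,
    each approximating a subset of the data (not the paper's method)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    Xb = np.hstack([X, np.ones((n, 1))])      # add intercept column
    labels = rng.integers(0, k, size=n)       # random initial partition
    coefs = np.zeros((k, Xb.shape[1]))
    for _ in range(n_iter):
        # Refit one least-squares regression function per current group.
        for j in range(k):
            mask = labels == j
            if mask.sum() >= Xb.shape[1]:
                coefs[j], *_ = np.linalg.lstsq(Xb[mask], y[mask], rcond=None)
        # Reassign each point to the function with the smallest squared residual.
        residuals = (Xb @ coefs.T - y[:, None]) ** 2
        new_labels = residuals.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break                             # partition stabilized
        labels = new_labels
    return coefs, labels

if __name__ == "__main__":
    # Toy usage: two noisy lines mixed together.
    rng = np.random.default_rng(1)
    x = rng.uniform(-1, 1, size=(200, 1))
    y = np.where(x[:, 0] > 0, 3 * x[:, 0] + 1, -2 * x[:, 0]) + 0.05 * rng.normal(size=200)
    coefs, labels = clusterwise_linreg(x, y, k=2)
    print(coefs)

Such alternating schemes are sensitive to the initial partition, which is one motivation, per the abstract, for the paper's incremental strategy and its special procedure for generating good starting points.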

History

Journal

Journal of Optimization Theory and Applications

Volume

164

Pagination

755-780

Location

Cham, Switzerland

ISSN

0022-3239

eISSN

1573-2878

Language

eng

Publication classification

C1.1 Refereed article in a scholarly journal

Copyright notice

2014, Springer Science+Business Media New York

Issue

3

Publisher

Springer