Nonsmooth nonconvex optimization approach to clusterwise linear regression problems
Version 2: 2024-06-04, 13:50
Version 1: 2018-08-24, 14:32
journal contribution
posted on 2013-08-16, 00:00, authored by A. M. Bagirov, Julien Ugon, H. Mirzayeva

Clusterwise regression consists of finding a number of regression functions, each approximating a subset of the data. In this paper, a new approach for solving clusterwise linear regression problems is proposed based on a nonsmooth nonconvex formulation. We present an algorithm for minimizing this nonsmooth nonconvex function. This algorithm incrementally divides the whole data set into groups that can each be easily approximated by one linear regression function. A special procedure is introduced to generate a good starting point for solving the global optimization problems arising at each iteration of the incremental algorithm. Such an approach allows one to find a global or near-global solution to the problem when the data sets are sufficiently dense. The algorithm is compared with the multistart Späth algorithm on several publicly available data sets for regression analysis. © 2013 Elsevier B.V. All rights reserved.
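To illustrate the problem being solved, below is a minimal Python sketch of the clusterwise linear regression objective (the min-over-functions residual form, which is what makes the problem nonsmooth and nonconvex) together with a simple Späth-style alternating heuristic of the kind the paper uses as a comparison baseline. This is not the authors' incremental nonsmooth algorithm; the function and parameter names (clusterwise_objective, fit_clusterwise, n_clusters) are hypothetical and chosen for illustration only.

import numpy as np

def clusterwise_objective(X, y, coefs):
    # Sum over data points of the smallest squared residual across the
    # k linear regression functions; the pointwise "min" is nonsmooth.
    residuals = (y[:, None] - X @ coefs.T) ** 2   # shape (n, k)
    return residuals.min(axis=1).sum()

def fit_clusterwise(X, y, n_clusters=2, n_iter=50, seed=0):
    # Späth-style heuristic (illustrative sketch): alternate between
    # assigning each point to its best-fitting regression function and
    # refitting each function by least squares on its assigned points.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    labels = rng.integers(n_clusters, size=n)      # random initial partition
    coefs = np.zeros((n_clusters, d))
    for _ in range(n_iter):
        for j in range(n_clusters):                # refit each group
            mask = labels == j
            if mask.sum() >= d:                    # need enough points to fit
                coefs[j], *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        new_labels = ((y[:, None] - X @ coefs.T) ** 2).argmin(axis=1)
        if np.array_equal(new_labels, labels):     # partition stabilized
            break
        labels = new_labels
    return coefs, labels, clusterwise_objective(X, y, coefs)

A typical call would be coefs, labels, obj = fit_clusterwise(X, y, n_clusters=3). Because the objective has many local minima, such a heuristic is usually restarted from several random partitions (multistart), which is the weakness the paper's incremental approach with constructed starting points is designed to address.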
History
Journal: European Journal of Operational Research
Volume: 229
Issue: 1
Pagination: 132-142
Publisher: Elsevier
Location: Amsterdam, The Netherlands
Publisher DOI:
ISSN: 0377-2217
Language: eng
Publication classification: C1.1 Refereed article in a scholarly journal
Copyright notice: 2013, Elsevier B.V.