File(s) under permanent embargo
From convex to nonconvex: A loss function analysis for binary classification
conference contribution
posted on 2010-12-01, 00:00, authored by L Zhao, Musa Mammadov, John Yearwood

Problems of data classification can be studied in the framework of regularization theory as ill-posed problems. In this framework, loss functions play an important role in the application of regularization theory to classification. In this paper, we review several important convex loss functions, including the hinge loss, square loss, modified square loss, exponential loss, and logistic regression loss, as well as some non-convex loss functions, such as the sigmoid loss, ψ-loss, ramp loss, normalized sigmoid loss, and the loss function of the 2-layer neural network. Based on the analysis of these loss functions, we propose a new differentiable non-convex loss function, called the smoothed 0-1 loss function, which is a natural approximation of the 0-1 loss function. To compare the performance of different loss functions, we propose two binary classification algorithms, one for convex loss functions and the other for non-convex loss functions. A set of experiments is conducted on several binary data sets from the UCI repository. The results show that the proposed smoothed 0-1 loss function is robust, especially on noisy data sets with many outliers. © 2010 IEEE.
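The abstract's exact formula for the smoothed 0-1 loss is not given in this record, but the losses it surveys are standard margin-based losses. The sketch below, a minimal illustration only, writes each as a function of the margin m = y·f(x) with labels y ∈ {-1, +1}; the sigmoid-shaped `smoothed_zero_one_loss` is an assumed approximation of the 0-1 step (sharpness parameter `k` is hypothetical), not necessarily the paper's definition.

```python
import math

# Margin-based losses: the argument m is the margin y * f(x), y in {-1, +1}.

def hinge_loss(m):
    """Hinge loss (SVM): max(0, 1 - m)."""
    return max(0.0, 1.0 - m)

def square_loss(m):
    """Square loss: (1 - m)^2."""
    return (1.0 - m) ** 2

def logistic_loss(m):
    """Logistic regression loss: log(1 + exp(-m))."""
    return math.log(1.0 + math.exp(-m))

def zero_one_loss(m):
    """0-1 loss: 1 if misclassified (m <= 0), else 0. Non-differentiable."""
    return 1.0 if m <= 0 else 0.0

def smoothed_zero_one_loss(m, k=5.0):
    """A sigmoid-shaped, differentiable approximation of the 0-1 loss
    (illustrative; the paper's exact form may differ). As k grows,
    this curve approaches the 0-1 step function."""
    return 1.0 / (1.0 + math.exp(k * m))
```

Unlike the convex hinge and logistic losses, which grow without bound as the margin becomes more negative, the smoothed 0-1 loss is bounded by 1, which is the property the abstract links to robustness against outliers.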
History

Pagination: 1281-1288
Location: Sydney, N.S.W.
Publisher DOI:
Start date: 2010-12-13
End date: 2010-12-13
ISSN: 1550-4786
ISBN-13: 9780769542577
Publication classification: EN.1 Other conference paper
Title of proceedings: Proceedings - IEEE International Conference on Data Mining, ICDM
Publisher: IEEE
Place of publication: Piscataway, N.J.