
File(s) under permanent embargo

From convex to nonconvex: A loss function analysis for binary classification

conference contribution
posted on 2010-12-01, authored by L. Zhao, Musa Mammadov, John Yearwood
Problems of data classification can be studied in the framework of regularization theory as ill-posed problems, and in this framework loss functions play an important role in applying regularization theory to classification. In this paper, we review several important convex loss functions, including the hinge loss, square loss, modified square loss, exponential loss, and logistic regression loss, as well as some non-convex loss functions, such as the sigmoid loss, ψ-loss, ramp loss, normalized sigmoid loss, and the loss function of a 2-layer neural network. Based on an analysis of these loss functions, we propose a new differentiable non-convex loss function, called the smoothed 0-1 loss function, which is a natural approximation of the 0-1 loss function. To compare the performance of different loss functions, we propose two classification algorithms, one for convex loss functions and the other for non-convex loss functions. Experiments are conducted on several binary data sets from the UCI repository. The results show that the proposed smoothed 0-1 loss function is robust, especially on noisy data sets with many outliers. © 2010 IEEE.
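
The abstract does not give the paper's exact construction of the smoothed 0-1 loss, but every loss it surveys is a standard function of the margin z = y·f(x). The Python sketch below is illustrative only: the convex and non-convex losses follow their textbook definitions, while smoothed_01 is one plausible differentiable approximation (a steep sigmoid with an assumed steepness parameter k), not necessarily the paper's own definition.

import numpy as np

# Each loss takes the margin z = y * f(x), where y is the true label
# in {-1, +1} and f(x) is the classifier's real-valued output.

def hinge(z):          # convex: max(0, 1 - z)
    return np.maximum(0.0, 1.0 - z)

def square(z):         # convex: (1 - z)^2
    return (1.0 - z) ** 2

def exponential(z):    # convex: exp(-z)
    return np.exp(-z)

def logistic(z):       # convex: log(1 + exp(-z))
    return np.log1p(np.exp(-z))

def sigmoid_loss(z):   # non-convex, bounded: 1 / (1 + exp(z))
    return 1.0 / (1.0 + np.exp(z))

def ramp(z):           # non-convex: hinge loss clipped at 1
    return np.minimum(1.0, np.maximum(0.0, 1.0 - z))

def smoothed_01(z, k=10.0):
    # Assumption: a steep sigmoid as a differentiable stand-in for the
    # 0-1 loss; as k grows it tends pointwise to 1[z <= 0]. The paper's
    # smoothed 0-1 loss may use a different (e.g. piecewise) form.
    return 1.0 / (1.0 + np.exp(k * z))

# A badly misclassified outlier (z = -5) versus a correct prediction (z = 2):
z = np.array([-5.0, 2.0])
for name, fn in [("hinge", hinge), ("ramp", ramp), ("smoothed 0-1", smoothed_01)]:
    print(name, fn(z))

The comparison above illustrates the robustness claim: convex losses such as the hinge grow without bound on badly misclassified points (the hinge gives 6 at z = -5), so a few outliers can dominate training, whereas bounded non-convex losses like the ramp and the smoothed 0-1 cap each point's penalty near 1.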

History

Pagination

1281-1288

Location

Sydney, N.S.W.

Start date

2010-12-13

End date

2010-12-13

ISSN

1550-4786

ISBN-13

9780769542577

Publication classification

EN.1 Other conference paper

Title of proceedings

Proceedings - IEEE International Conference on Data Mining, ICDM

Publisher

IEEE

Place of publication

Piscataway, N.J.
