Deakin University
Mixing linear SVMs for nonlinear classification

journal contribution
posted on 2010-12-01, 00:00 authored by Zhouyu Fu, Antonio Robles-Kelly, Jun Zhou
In this paper, we address the problem of combining linear support vector machines (SVMs) for classification of large-scale nonlinear datasets. The motivation is to exploit both the efficiency of linear SVMs (LSVMs) in learning and prediction and the power of nonlinear SVMs in classification. To this end, we develop an LSVM mixture model that exploits a divide-and-conquer strategy by partitioning the feature space into subregions of linearly separable data points and learning an LSVM for each of these regions. We do this implicitly by deriving a generative model over the joint data and label distributions. Consequently, we can impose priors on the mixing coefficients and perform implicit model selection in a top-down manner during the parameter estimation process. This guarantees the sparsity of the learned model. Experimental results show that the proposed method can achieve the efficiency of LSVMs in the prediction phase while still providing a classification performance comparable to nonlinear SVMs.
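The divide-and-conquer idea in the abstract can be sketched in a few lines. Note this is a simplified illustration, not the paper's generative EM formulation: here the partition into subregions is made explicit with k-means (the paper learns it implicitly from the joint data/label distribution), one linear SVM is trained per region, and test points are routed to the SVM of their nearest cluster centre. All class names and parameter values below are hypothetical choices for the sketch.

```python
# Simplified mixture-of-linear-SVMs sketch (assumed scikit-learn API).
# Not the authors' method: the region assignment uses hard k-means
# clustering rather than a learned generative model with priors on
# the mixing coefficients.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC


class LinearSVMMixture:
    def __init__(self, n_regions=4, C=1.0, random_state=0):
        self.km = KMeans(n_clusters=n_regions, n_init=10,
                         random_state=random_state)
        self.C = C
        self.random_state = random_state
        self.experts = {}

    def fit(self, X, y):
        # Partition the feature space into subregions.
        regions = self.km.fit_predict(X)
        for r in np.unique(regions):
            idx = regions == r
            if len(np.unique(y[idx])) > 1:
                # Train one linear SVM on the points in this region.
                svm = LinearSVC(C=self.C, random_state=self.random_state)
                self.experts[r] = svm.fit(X[idx], y[idx])
            else:
                # Pure region: store the single label instead of an SVM.
                self.experts[r] = int(y[idx][0])
        return self

    def predict(self, X):
        # Route each test point to its region's linear SVM -- prediction
        # stays linear-SVM cheap, which is the efficiency argument above.
        regions = self.km.predict(X)
        out = np.empty(len(X), dtype=int)
        for r in np.unique(regions):
            idx = regions == r
            expert = self.experts[r]
            out[idx] = (expert.predict(X[idx])
                        if hasattr(expert, "predict") else expert)
        return out
```

On a nonlinear toy problem such as scikit-learn's two-moons dataset, this piecewise-linear model separates classes that a single linear SVM cannot, which mirrors the paper's motivation.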

History

Journal

IEEE Transactions on Neural Networks

Volume

21

Pagination

1963-1975

Location

Piscataway, N.J.

ISSN

1045-9227

Language

eng

Publication classification

C1.1 Refereed article in a scholarly journal

Copyright notice

2010, IEEE

Issue

12

Publisher

Institute of Electrical and Electronics Engineers
