File(s) under permanent embargo
Mixing linear SVMs for nonlinear classification
journal contribution
Posted on 2010-12-01, authored by Zhouyu Fu, Antonio Robles-Kelly, Jun Zhou

In this paper, we address the problem of combining linear support vector machines (SVMs) for the classification of large-scale nonlinear datasets. The motivation is to exploit both the efficiency of linear SVMs (LSVMs) in learning and prediction and the power of nonlinear SVMs in classification. To this end, we develop an LSVM mixture model that follows a divide-and-conquer strategy: it partitions the feature space into subregions of linearly separable data points and learns an LSVM for each region. We do this implicitly by deriving a generative model over the joint data and label distributions. Consequently, we can impose priors on the mixing coefficients and perform implicit model selection in a top-down manner during parameter estimation, which guarantees the sparsity of the learned model. Experimental results show that the proposed method achieves the efficiency of LSVMs in the prediction phase while providing classification performance comparable to nonlinear SVMs.
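The divide-and-conquer idea in the abstract can be illustrated with a much simpler hard-partition variant. The sketch below is an assumption-laden stand-in, not the paper's method: instead of the generative model with priors on mixing coefficients fitted by EM, it partitions the feature space with k-means, trains one linear SVM per region via Pegasos-style subgradient descent, and routes each test point to the SVM of its nearest centroid. All function and class names (`kmeans`, `train_linear_svm`, `LSVMMixture`) are illustrative, not from the paper.

```python
# Hard divide-and-conquer mixture of linear SVMs (illustrative sketch only).
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means: returns centroids and final cluster assignments."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Pegasos-style subgradient descent for a linear SVM, y in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b, t = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * (X[i] @ w + b)
            w *= (1.0 - eta * lam)          # shrink weights (regularization)
            if margin < 1:                  # hinge-loss violation
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

class LSVMMixture:
    """One linear SVM per k-means region; hard gating by nearest centroid."""
    def __init__(self, k=4):
        self.k = k
    def fit(self, X, y):
        self.centers, labels = kmeans(X, self.k)
        self.svms = [train_linear_svm(X[labels == j], y[labels == j])
                     for j in range(self.k)]
        return self
    def predict(self, X):
        region = np.argmin(((X[:, None] - self.centers) ** 2).sum(-1), axis=1)
        out = np.empty(len(X))
        for j, (w, b) in enumerate(self.svms):
            m = region == j
            out[m] = np.sign(X[m] @ w + b)
        return out

# XOR-like toy data: a single linear SVM cannot separate the quadrants,
# but locally linear pieces can approximate the nonlinear boundary.
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 2))
y = np.sign(X[:, 0] * X[:, 1])
model = LSVMMixture(k=4).fit(X, y)
acc = (model.predict(X) == y).mean()
```

Note the contrast with the paper: here each point belongs to exactly one region chosen up front by clustering, whereas the paper's generative formulation learns the partition and the local classifiers jointly, with soft responsibilities and priors that prune away unneeded components.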
History
Journal
IEEE Transactions on Neural Networks
Volume
21
Issue
12
Pagination
1963 - 1975
Publisher
Institute of Electrical and Electronics Engineers
Location
Piscataway, N.J.
Publisher DOI
ISSN
1045-9227
Language
eng
Publication classification
C1.1 Refereed article in a scholarly journal
Copyright notice
2010, IEEE
Keywords
Classification; Expectation-maximization algorithm; Mixture of experts; Model selection; Support vector machines; Science & Technology; Technology; Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic; Computer Science; Engineering; MIXTURES; SELECTION