Deakin University

File(s) under permanent embargo

Stable feature selection with support vector machines

chapter
posted on 2015-01-01, 00:00 authored by Iman Kamkar, Sunil Gupta, Quoc-Dinh Phung, Svetha Venkatesh
The support vector machine (SVM) is a popular classification method, well known for finding the maximum-margin hyperplane. Combining the SVM with an l1-norm penalty further enables it to perform feature selection and margin maximization simultaneously within a single framework. However, the l1-norm SVM is unstable in selecting features in the presence of correlated features. We propose a new method to increase the stability of the l1-norm SVM by encouraging similarity between feature weights in proportion to feature correlations, which are captured via a feature covariance matrix. Our proposed method can capture both positive and negative correlations between features. We formulate the model as a convex optimization problem and propose a solution based on alternating minimization. Using both synthetic and real-world datasets, we show that our model achieves better stability and classification accuracy compared to several state-of-the-art regularized classification methods.
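The abstract's idea of an l1-norm SVM whose weights are additionally smoothed by a feature covariance matrix can be sketched in a few lines of numpy. This is a minimal illustration, not the authors' algorithm: it uses plain subgradient descent on the hinge loss rather than the paper's alternating minimization, and the penalty weights `lam1`, `lam2` and the use of the empirical covariance as the correlation structure are assumptions for the sketch.

```python
import numpy as np

def objective(w, X, y, lam1, lam2, C):
    """Hinge loss + l1 sparsity penalty + covariance-based smoothing penalty.

    The w @ C @ w term is a stand-in for the paper's correlation-aware
    regularizer; it couples weights of correlated features.
    """
    hinge = np.maximum(0.0, 1.0 - y * (X @ w)).mean()
    return hinge + lam1 * np.abs(w).sum() + lam2 * (w @ C @ w)

def train(X, y, lam1=0.01, lam2=0.05, lr=0.05, iters=500):
    """Subgradient descent sketch of a covariance-regularized l1-norm SVM."""
    n, d = X.shape
    C = np.cov(X, rowvar=False)        # empirical feature covariance (assumption)
    w = np.zeros(d)
    for _ in range(iters):
        margins = 1.0 - y * (X @ w)
        active = (margins > 0).astype(float)   # samples violating the margin
        grad = -(X * (y * active)[:, None]).mean(axis=0)   # hinge subgradient
        grad += lam1 * np.sign(w) + 2.0 * lam2 * (C @ w)   # penalty subgradients
        w -= lr * grad
    return w
```

As a quick sanity check, training on synthetic data where the label depends on two correlated features recovers a weight vector that classifies the training set well while keeping the irrelevant weights small.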

History

Title of book

AI 2015: Advances in Artificial Intelligence. 28th Australasian Joint Conference, Canberra, ACT, Australia, November 30 - December 4, 2015, Proceedings

Volume

9457

Series

Lecture Notes in Computer Science; v. 9457

Chapter number

25

Pagination

298-308

Publisher

Springer

Place of publication

Berlin, Germany

ISSN

0302-9743

eISSN

1611-3349

ISBN-13

9783319263502

Language

eng

Publication classification

B Book chapter; B1 Book chapter

Copyright notice

2015, Springer

Extent

57

Editor/Contributor(s)

B. Pfahringer, J. Renz
