Deakin University

Stable feature selection with support vector machines

Version 2 2024-06-03, 17:12
Version 1 2016-02-15, 17:00
chapter
posted on 2024-06-03, 17:12 authored by I Kamkar, Sunil Gupta, Q Phung, Svetha Venkatesh
The support vector machine (SVM) is a popular method for classification, well known for finding the maximum-margin hyperplane. Combining the SVM with an l1-norm penalty further enables it to perform feature selection and margin maximization simultaneously within a single framework. However, the l1-norm SVM is unstable in selecting features in the presence of correlated features. We propose a new method that increases the stability of the l1-norm SVM by encouraging similarity between feature weights according to feature correlations, captured via a feature covariance matrix. Our proposed method can capture both positive and negative correlations between features. We formulate the model as a convex optimization problem and propose a solution based on alternating minimization. Using both synthetic and real-world datasets, we show that our model achieves better stability and classification accuracy compared to several state-of-the-art regularized classification methods.
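The baseline the abstract builds on — an l1-norm linear SVM whose penalty shrinks many feature weights toward zero — can be sketched as follows. This is a minimal subgradient-descent illustration on synthetic data with two highly correlated informative features, not the authors' covariance-regularized model; the function name, hyperparameters, and data layout are all illustrative assumptions.

```python
import numpy as np

def l1_svm(X, y, lam=0.1, lr=0.01, epochs=500):
    """Sketch of an l1-norm linear SVM: subgradient descent on
    mean hinge loss + lam * ||w||_1. The l1 term pushes many
    weights to (near) zero, performing feature selection."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        margins = y * (X @ w)
        active = margins < 1                    # points violating the margin
        grad = -(X[active] * y[active, None]).sum(axis=0) / n
        grad += lam * np.sign(w)                # subgradient of the l1 term
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
n, d = 200, 10
z = rng.standard_normal(n)
X = rng.standard_normal((n, d))
X[:, 0] = z + 0.01 * rng.standard_normal(n)     # features 0 and 1 are
X[:, 1] = z + 0.01 * rng.standard_normal(n)     # almost perfectly correlated
y = np.sign(z + 0.1 * rng.standard_normal(n))   # labels driven by z

w = l1_svm(X, y)
selected = np.flatnonzero(np.abs(w) > 1e-3)
print("selected features:", selected)
```

With correlated informative columns like features 0 and 1, lasso-type penalties are known to split or arbitrarily assign weight between them, and the surviving feature can change from one resample to the next — the instability this chapter's covariance-based regularizer targets. Note that plain subgradient descent gives only approximate sparsity; proximal (soft-thresholding) methods would yield exact zeros.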

History

Volume

9457

Chapter number

25

Pagination

298-308

ISSN

0302-9743

eISSN

1611-3349

ISBN-13

9783319263502

Language

eng

Publication classification

B Book chapter, B1 Book chapter

Copyright notice

2015, Springer

Extent

57

Editor/Contributor(s)

Pfahringer B, Renz J

Publisher

Springer

Place of publication

Berlin, Germany

Title of book

AI 2015: Advances in Artificial Intelligence: 28th Australasian Joint Conference, Canberra, ACT, Australia, November 30 - December 4, 2015, Proceedings

Series

Lecture notes in computer science; v.9457
