Stable feature selection with support vector machines

Kamkar, Iman, Gupta, Sunil K., Phung, Dinh and Venkatesh, Svetha 2015, Stable feature selection with support vector machines. In Pfahringer, Bernhard and Renz, Jochen (ed.), AI 2015: Advances in artificial intelligence. 28th Australasian Joint Conference, Canberra, ACT, Australia, November 30 - December 4, 2015, Proceedings, Springer, Berlin, Germany, pp. 298-308, doi: 10.1007/978-3-319-26350-2_26.


Title Stable feature selection with support vector machines
Author(s) Kamkar, Iman
Gupta, Sunil K. (ORCID: orcid.org/0000-0002-3308-1930)
Phung, Dinh (ORCID: orcid.org/0000-0002-9977-8247)
Venkatesh, Svetha (ORCID: orcid.org/0000-0001-8675-6631)
Title of book AI 2015: Advances in artificial intelligence. 28th Australasian Joint Conference Canberra, ACT, Australia, November 30 - December 4, 2015 Proceedings
Editor(s) Pfahringer, Bernhard
Renz, Jochen
Publication date 2015
Series Lecture notes in computer science; v.9457
Chapter number 25
Total chapters 57
Start page 298
End page 308
Total pages 11
Publisher Springer
Place of Publication Berlin, Germany
Keyword(s) Science & Technology
Technology
Computer Science, Artificial Intelligence
Robotics
Computer Science
REGRESSION SHRINKAGE
VARIABLE SELECTION
MODEL SELECTION
BREAST-CANCER
LASSO
PREDICTOR
SIGNATURE
MORTALITY
SURVIVAL
Summary The support vector machine (SVM) is a popular classification method, well known for finding the maximum-margin hyperplane. Combining the SVM with an l1-norm penalty further enables it to perform feature selection and margin maximization simultaneously within a single framework. However, the l1-norm SVM is unstable in its feature selection in the presence of correlated features. We propose a new method that increases the stability of the l1-norm SVM by encouraging similarity between feature weights according to feature correlations, captured via a feature covariance matrix. Our proposed method can capture both positive and negative correlations between features. We formulate the model as a convex optimization problem and propose a solution based on alternating minimization. Using both synthetic and real-world datasets, we show that our model achieves better stability and classification accuracy than several state-of-the-art regularized classification methods.
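The abstract contrasts the proposed method with the standard l1-norm SVM, whose selected feature set can vary across resamples when features are correlated. The following is a minimal sketch of that *baseline* behavior (not the paper's covariance-regularized method): it fits an l1-penalized linear SVM and measures selection stability as the mean pairwise Jaccard overlap of feature sets across bootstrap resamples. The synthetic data, regularization strength `C`, and the `selected` helper are all illustrative assumptions.

```python
# Sketch: l1-norm SVM feature selection and a simple stability check.
# This illustrates the standard l1-SVM baseline the paper builds on,
# not the proposed covariance-regularized model. All parameters and
# the synthetic dataset are illustrative assumptions.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n, p = 200, 20

# Two highly correlated informative features (cols 0 and 1) + noise.
z = rng.standard_normal(n)
X = rng.standard_normal((n, p))
X[:, 0] = z + 0.05 * rng.standard_normal(n)
X[:, 1] = z + 0.05 * rng.standard_normal(n)
y = np.where(z + 0.5 * X[:, 2] > 0, 1, -1)

def selected(X, y, C=0.1):
    """Return the index set of features with nonzero l1-SVM weights."""
    clf = LinearSVC(penalty="l1", loss="squared_hinge", dual=False,
                    C=C, max_iter=5000)
    clf.fit(X, y)
    return set(np.flatnonzero(np.abs(clf.coef_.ravel()) > 1e-6))

# Stability: mean Jaccard overlap of selected sets over bootstraps.
sets = []
for _ in range(10):
    idx = rng.integers(0, n, n)          # bootstrap resample
    sets.append(selected(X[idx], y[idx]))
pairs = [(a, b) for i, a in enumerate(sets) for b in sets[i + 1:]]
jaccard = np.mean([len(a & b) / max(len(a | b), 1) for a, b in pairs])
print("mean Jaccard stability:", round(float(jaccard), 3))
```

With correlated columns 0 and 1, the l1 penalty tends to pick one of the pair somewhat arbitrarily per resample, which depresses the Jaccard score; the paper's covariance-based penalty is designed to mitigate exactly this effect.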
ISBN 9783319263502
ISSN 0302-9743
1611-3349
Language eng
DOI 10.1007/978-3-319-26350-2_26
Field of Research 080109 Pattern Recognition and Data Mining
08 Information And Computing Sciences
Socio Economic Objective 970108 Expanding Knowledge in the Information and Computing Sciences
HERDC Research category B1 Book chapter
ERA Research output type B Book chapter
Copyright notice ©2015, Springer
Persistent URL http://hdl.handle.net/10536/DRO/DU:30081381

Document type: Book Chapter
Collection: Centre for Pattern Recognition and Data Analytics
Unless expressly stated otherwise, the copyright for items in DRO is owned by the author, with all rights reserved.

Citation counts: TR Web of Science Citation Count  Cited 2 times in TR Web of Science
Scopus Citation Count Cited 3 times in Scopus
Access Statistics: 269 Abstract Views, 3 File Downloads
Created: Mon, 15 Feb 2016, 16:01:23 EST

Every reasonable effort has been made to ensure that permission has been obtained for items included in DRO. If you believe that your rights have been infringed by this repository, please contact drosupport@deakin.edu.au.