Stable feature selection with support vector machines
Kamkar, Iman, Gupta, Sunil K., Phung, Dinh and Venkatesh, Svetha 2015, Stable feature selection with support vector machines. In Pfahringer, Bernhard and Renz, Jochen (eds), AI 2015: Advances in Artificial Intelligence. 28th Australasian Joint Conference, Canberra, ACT, Australia, November 30 - December 4, 2015, Proceedings, Springer, Berlin, Germany, pp. 298-308, doi: 10.1007/978-3-319-26350-2_26.
The support vector machine (SVM) is a popular classification method, well known for finding the maximum-margin hyperplane. Combining the SVM with an l1-norm penalty further enables it to perform feature selection and margin maximization simultaneously within a single framework. However, the l1-norm SVM is unstable in its feature selection in the presence of correlated features. We propose a new method to increase the stability of the l1-norm SVM by encouraging similarity between feature weights according to feature correlations, captured via a feature covariance matrix. Our proposed method can capture both positive and negative correlations between features. We formulate the model as a convex optimization problem and propose a solution based on alternating minimization. Using both synthetic and real-world datasets, we show that our model achieves better stability and classification accuracy than several state-of-the-art regularized classification methods.
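The instability the abstract refers to can be seen directly: when two features are nearly identical, an l1-penalized linear SVM tends to pick one of them arbitrarily, so the selected feature set can change from one resample of the data to the next. The sketch below is illustrative only, not the authors' proposed method: it fits a plain l1-norm linear SVM (hinge loss plus l1 penalty, via proximal subgradient descent) on bootstrap resamples of a synthetic dataset with two almost perfectly correlated informative features, and collects the support (set of nonzero weights) found on each resample. All function names and parameter values here are assumptions made for the example.

```python
import numpy as np

def fit_l1_svm(X, y, lam=0.1, lr=0.01, epochs=500):
    """Illustrative l1-norm linear SVM: proximal (sub)gradient descent
    on average hinge loss with soft-thresholding for the l1 penalty."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        margins = y * (X @ w)
        active = margins < 1                      # points violating the margin
        grad = -(X[active] * y[active, None]).sum(axis=0) / n
        w -= lr * grad                            # hinge-loss subgradient step
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # prox of l1
    return w

# Synthetic data: features 0 and 1 are near-duplicates of the signal z.
rng = np.random.default_rng(0)
n, d = 200, 6
z = rng.normal(size=n)
X = rng.normal(size=(n, d))
X[:, 0] = z + 0.01 * rng.normal(size=n)
X[:, 1] = z + 0.01 * rng.normal(size=n)
y = np.sign(z + 0.3 * rng.normal(size=n))

# Refit on bootstrap resamples and record which features get nonzero weight.
supports = []
for seed in range(10):
    idx = np.random.default_rng(seed).integers(0, n, n)
    w = fit_l1_svm(X[idx], y[idx])
    supports.append(frozenset(np.flatnonzero(np.abs(w) > 1e-6)))
print("distinct supports across resamples:", len(set(supports)))
```

Because features 0 and 1 carry the same information, the l1 penalty has no preference between them, and the chosen support can differ across resamples; the paper's covariance-based penalty is designed to tie such correlated weights together instead.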