Deakin University
File(s) under permanent embargo

Exploiting feature relationships towards stable feature selection

conference contribution
posted on 2015-01-01 authored by Iman Kamkar, S Gupta, Quoc-Dinh Phung, Svetha Venkatesh
Feature selection is an important step in building predictive models for most real-world problems. Lasso is among the most popular feature selection methods, but it is unstable in the features it selects when features are correlated. In this work, we propose a new method that increases the stability of Lasso by encouraging similarity between features based on their relatedness, which is captured via a feature covariance matrix. Besides modeling positive feature correlations, our method can also identify negative correlations between features. We propose a convex formulation for our model, along with an alternating optimization algorithm that learns both the feature weights and the relationships between them. Using both synthetic and real-world data, we show that the proposed method is more stable than Lasso and many state-of-the-art shrinkage and feature selection methods, while its predictive performance remains comparable.
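The instability the abstract refers to can be seen directly: when two features are near-duplicates, Lasso tends to pick one of them arbitrarily, so the selected feature set changes from one resample of the data to the next. The sketch below illustrates this effect with scikit-learn's `Lasso` on bootstrap resamples; it is a minimal illustration of the problem being addressed, not the paper's proposed method, and the data, `alpha`, and resample count are arbitrary choices for demonstration.

```python
# Illustration (not the paper's method): measure how Lasso's selected
# feature support varies across bootstrap resamples when two features
# are nearly identical.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 10

# Feature 1 is a near-copy of feature 0; the rest are independent noise.
x0 = rng.normal(size=n)
X = np.column_stack([x0,
                     x0 + 0.01 * rng.normal(size=n),
                     rng.normal(size=(n, p - 2))])
y = X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=n)

supports = set()
for _ in range(30):
    idx = rng.integers(0, n, size=n)                 # bootstrap resample
    coef = Lasso(alpha=0.1).fit(X[idx], y[idx]).coef_
    supports.add(tuple(np.flatnonzero(np.abs(coef) > 1e-8)))

# With correlated features, more than one distinct support typically
# appears across resamples -- the instability the paper targets.
print(len(supports), "distinct supports selected")
```

A stable selector would return the same support on every resample (one distinct support); counting distinct supports across resamples is one simple way to quantify the instability that the proposed covariance-based regularizer is designed to reduce.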

History

Event

Data Science and Advanced Analytics. Conference (2015 : Paris, France)

Pagination

1-10

Publisher

IEEE

Location

Paris, France

Place of publication

Piscataway, N.J.

Start date

2015-10-19

End date

2015-10-21

ISBN-13

9781467382724

Language

eng

Publication classification

E Conference publication; E1 Full written paper - refereed

Copyright notice

2015, IEEE

Editor/Contributor(s)

E Gaussier, L Cao, P Gallinari, J Kwok, G Pasi, O Zaiane

Title of proceedings

DSAA 2015: IEEE International Conference on Data Science and Advanced Analytics