Deakin University

Exploiting feature relationships towards stable feature selection

Version 2 2024-06-05, 11:49
Version 1 2016-03-07, 18:20
conference contribution
posted on 2024-06-05, 11:49 authored by I Kamkar, S Gupta, QD Phung, Svetha Venkatesh
Feature selection is an important step in building predictive models for most real-world problems. Lasso is one of the most popular feature selection methods, but it is unstable in its selections when features are correlated. In this work, we propose a new method that aims to increase the stability of Lasso by encouraging similarity between features based on their relatedness, which is captured via a feature covariance matrix. Besides modeling positive feature correlations, our method can also identify negative correlations between features. We propose a convex formulation for our model, along with an alternating optimization algorithm that learns both the feature weights and the relationships between features. Using both synthetic and real-world data, we show that the proposed method is more stable than Lasso and many state-of-the-art shrinkage and feature selection methods, while its predictive performance remains comparable to these methods.
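The alternating scheme the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the paper's exact formulation: the quadratic penalty w^T Omega^{-1} w coupling weights to a feature covariance matrix, and the trace-normalized matrix square-root update for Omega, are assumptions borrowed from multi-task relationship learning, and all names and parameters (`covariance_regularized_lasso`, `lam1`, `lam2`) are hypothetical.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 norm (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def covariance_regularized_lasso(X, y, lam1=0.1, lam2=0.1,
                                 n_outer=20, n_inner=100, eps=1e-6):
    """Sketch of an alternating optimization for
        0.5 * ||y - X w||^2 + lam1 * ||w||_1 + lam2 * w^T Omega^{-1} w
    Step 1 fixes Omega and updates w by proximal gradient descent;
    step 2 fixes w and updates Omega via a trace-normalized matrix
    square root (an assumed closed form, as in multi-task
    relationship learning)."""
    n, d = X.shape
    w = np.zeros(d)
    Omega = np.eye(d) / d  # start from an uninformative feature covariance
    for _ in range(n_outer):
        Omega_inv = np.linalg.inv(Omega + eps * np.eye(d))
        # Lipschitz constant of the smooth part (data term + quadratic penalty)
        L = np.linalg.norm(X, 2) ** 2 + 2 * lam2 * np.linalg.norm(Omega_inv, 2)
        for _ in range(n_inner):
            grad = X.T @ (X @ w - y) + 2 * lam2 * (Omega_inv @ w)
            w = soft_threshold(w - grad / L, lam1 / L)
        # Closed-form covariance update: Omega proportional to (w w^T + eps I)^{1/2}
        M = np.outer(w, w) + eps * np.eye(d)
        vals, vecs = np.linalg.eigh(M)
        S = vecs @ np.diag(np.sqrt(np.maximum(vals, 0.0))) @ vecs.T
        Omega = S / np.trace(S)
    return w, Omega
```

Because the Omega subproblem has a closed form and the w subproblem is a convex Lasso-type fit, each alternation is cheap; the learned Omega then biases correlated features toward similar weights, which is the stated route to stabler selection.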

History

Pagination

1-10

Location

Paris, France

Start date

2015-10-19

End date

2015-10-21

ISBN-13

9781467382724

Language

eng

Publication classification

E Conference publication, E1 Full written paper - refereed

Copyright notice

2015, IEEE

Editor/Contributor(s)

Gaussier E, Cao LB, Gallinari P, Kwok J, Pasi G, Zaiane O

Title of proceedings

DSAA 2015: IEEE International Conference on Data Science and Advanced Analytics

Event

Data Science and Advanced Analytics. Conference (2015 : Paris, France)

Publisher

IEEE

Place of publication

Piscataway, N.J.