Integrating joint feature selection into subspace learning: a formulation of 2DPCA for outliers robust feature selection

journal contribution
posted on 2020-01-01, 00:00 authored by Imran Razzak, Raghib Abu Saris, Michael Blumenstein, Guandong Xu
Since principal component analysis (PCA) and its variants are sensitive to outliers, which degrades their performance and applicability in real-world settings, several variants have been proposed to improve robustness. However, most existing methods remain sensitive to outliers and are unable to select useful features. To overcome the sensitivity of PCA to outliers, in this paper we introduce two-dimensional outlier-robust principal component analysis (ORPCA), which imposes joint constraints on the objective function. ORPCA relaxes the orthogonality constraints and penalizes the regression coefficients; it thus selects important features while ignoring features already captured by other principal components. The squared Frobenius norm is well known to be sensitive to outliers; to overcome this, we devise an alternative derivation of the objective function. Experimental results on four publicly available benchmark datasets demonstrate the effectiveness of joint feature selection and show better performance than state-of-the-art dimensionality-reduction methods.
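The abstract does not give the exact objective, but the ingredients it names (a non-squared, outlier-robust reconstruction loss plus a penalty that makes the projection select features jointly) are commonly realized with l2,1 norms solved by iterative reweighting. The sketch below is a hypothetical illustration under that assumption, not the authors' published algorithm: the function name orpca_2d_sketch, its parameters, and the reweighted eigen-solver are all illustrative choices.

import numpy as np

def orpca_2d_sketch(images, k, lam=0.1, n_iter=20, eps=1e-8):
    # Hypothetical sketch of an outlier-robust 2DPCA-style solver.
    # ASSUMPTION: an l2,1-type reconstruction loss (down-weighting
    # outlier images rather than squaring their residuals) plus an
    # l2,1 penalty on the rows of the projection W for joint
    # feature selection. This is not the paper's exact formulation.
    w = images[0].shape[1]
    rng = np.random.default_rng(0)
    W = np.linalg.qr(rng.standard_normal((w, k)))[0]  # orthonormal init
    for _ in range(n_iter):
        # Robust loss via iterative reweighting: each image's weight is
        # the inverse of its reconstruction-residual norm, so outliers
        # contribute less than under a squared Frobenius loss.
        weights = [1.0 / (np.linalg.norm(X - X @ W @ W.T, "fro") + eps)
                   for X in images]
        # Weighted image-covariance matrix (2DPCA works on the 2D
        # images directly, without vectorizing them).
        C = sum(a * (X.T @ X) for a, X in zip(weights, images))
        # l2,1 penalty on the rows of W, handled as the usual diagonal
        # reweighting matrix.
        D = np.diag(1.0 / (2.0 * np.linalg.norm(W, axis=1) + eps))
        # Top-k eigenvectors of the penalized covariance (eigh returns
        # eigenvalues in ascending order, so take the last k columns).
        _, vecs = np.linalg.eigh(C - lam * D)
        W = vecs[:, -k:]
    return W

# Example: project 32x32 images onto 5 learned directions.
imgs = [np.random.default_rng(i).standard_normal((32, 32)) for i in range(100)]
W = orpca_2d_sketch(imgs, k=5)
features = imgs[0] @ W  # 32 x 5 reduced representation

The reweighting step is what replaces the squared Frobenius norm the abstract flags as outlier-sensitive: residuals enter the loss roughly linearly rather than quadratically, so a single corrupted image cannot dominate the learned projection.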

History

Journal

Neural Networks

Volume

121

Pagination

441-451

Publisher

Elsevier

Location

Amsterdam, The Netherlands

ISSN

0893-6080

Language

eng

Publication classification

C1.1 Refereed article in a scholarly journal