Supervised subspace learning with multi-class Lagrangian SVM on the Grassmann manifold

Pham, Duc-Son and Venkatesh, Svetha 2011, Supervised subspace learning with multi-class Lagrangian SVM on the Grassmann manifold, in AI 2011 : Advances in Artificial Intelligence : 24th Australasian Joint Conference, Perth, Australia, December 5-8, 2011 : proceedings, Springer-Verlag, Berlin, Germany, pp. 241-250, doi: 10.1007/978-3-642-25832-9_25.


Title Supervised subspace learning with multi-class Lagrangian SVM on the Grassmann manifold
Author(s) Pham, Duc-Son
Venkatesh, Svetha (ORCID: orcid.org/0000-0001-8675-6631)
Conference name Advances in Artificial Intelligence. Conference (24th : 2011 : Perth, Western Australia)
Conference location Perth, Western Australia
Conference dates 5-8 Dec. 2011
Title of proceedings AI 2011 : Advances in Artificial Intelligence : 24th Australasian Joint Conference, Perth, Australia, December 5-8, 2011 : proceedings
Editor(s) Wang, Dianhui
Reynolds, Mark
Publication date 2011
Series Lecture notes in artificial intelligence ; 7106
Conference series Advances in Artificial Intelligence Conference
Start page 241
End page 250
Total pages 10
Publisher Springer-Verlag
Place of publication Berlin, Germany
Keyword(s) classification errors
classifier design
data sets
dimensionality reduction
empirical risks
Grassmann manifold
hypothesis space
iterative algorithm
Lagrangian
limited training data
linear projections
multi-class
numerical studies
optimal subspace
robust performance
subspace learning
SVM classifiers
weak connection
Summary Learning robust subspaces that maximize class discrimination is challenging, and most current work treats dimensionality reduction and classifier design as only weakly connected steps. We propose an alternative framework in which the two steps are combined in a joint formulation that exploits their direct connection. Specifically, we learn an optimal subspace on the Grassmann manifold while jointly minimizing the classification error of an SVM classifier: we minimize the regularized empirical risk over both the hypothesis space of functions underlying this new generalized multi-class Lagrangian SVM and the Grassmann manifold, so that a linear projection is found together with the classifier. We propose an iterative algorithm that alternates between optimizing the classifier and the projection. Extensive numerical studies on challenging datasets show robust performance of the proposed scheme over other alternatives when limited training data is used, verifying the advantage of the joint formulation.
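The abstract describes an alternating scheme: fit an SVM on data projected into the current subspace, then update the projection while staying on the Grassmann manifold. The sketch below illustrates that idea in a deliberately simplified form — a binary hinge-loss SVM rather than the paper's multi-class Lagrangian SVM, with a subgradient step on the projection followed by a QR retraction back onto the manifold. All names, step sizes, and the specific retraction are illustrative assumptions, not the authors' exact algorithm.

```python
# Hedged sketch of joint subspace + SVM learning (binary simplification).
# Assumptions (not from the paper): hinge-loss SVM fit by subgradient
# descent, fixed step sizes, QR retraction onto the Stiefel/Grassmann manifold.
import numpy as np

def hinge_svm_fit(Z, y, lam=0.01, lr=0.1, iters=200):
    """Linear binary SVM (regularized hinge loss) via subgradient descent; y in {-1,+1}."""
    n, d = Z.shape
    w, b = np.zeros(d), 0.0
    for _ in range(iters):
        margins = y * (Z @ w + b)
        mask = margins < 1                      # margin-violating samples
        grad_w = lam * w - (y[mask, None] * Z[mask]).sum(0) / n
        grad_b = -y[mask].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def joint_subspace_svm(X, y, k=2, outer=20, lr_U=0.05, seed=0):
    """Alternate between (a) SVM fit in the subspace spanned by U and
    (b) a subgradient step on U, retracted back onto the manifold by QR."""
    n, D = X.shape
    rng = np.random.default_rng(seed)
    U, _ = np.linalg.qr(rng.standard_normal((D, k)))   # random orthonormal basis
    w, b = np.zeros(k), 0.0
    for _ in range(outer):
        Z = X @ U
        w, b = hinge_svm_fit(Z, y)
        # Subgradient of the hinge loss w.r.t. U, holding (w, b) fixed:
        # d/dU max(0, 1 - y_i (w^T U^T x_i + b)) = -y_i x_i w^T on violators.
        margins = y * (Z @ w + b)
        mask = margins < 1
        v = X[mask].T @ y[mask] if mask.any() else np.zeros(D)
        grad_U = -np.outer(v, w) / n
        U, _ = np.linalg.qr(U - lr_U * grad_U)          # retraction onto the manifold
    return U, w, b
```

The QR retraction keeps the learned projection orthonormal, so only the subspace it spans matters — which is what "on the Grassmann manifold" refers to; the paper's actual update rule and multi-class extension differ in detail.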
ISBN 9783642258312
364225831X
ISSN 0302-9743
Language eng
DOI 10.1007/978-3-642-25832-9_25
Field of Research 089999 Information and Computing Sciences not elsewhere classified
Socio Economic Objective 970108 Expanding Knowledge in the Information and Computing Sciences
HERDC Research category E1.1 Full written paper - refereed
Copyright notice ©2011, Springer-Verlag Berlin Heidelberg
Persistent URL http://hdl.handle.net/10536/DRO/DU:30044657

Document type: Conference Paper
Collection: School of Information Technology

Citation counts: Web of Science: 0; Scopus: 2
Created: Fri, 20 Apr 2012, 13:23:40 EST
