File(s) under permanent embargo
Supervised subspace learning with multi-class Lagrangian SVM on the Grassmann manifold
conference contribution
posted on 2011-01-01, 00:00, authored by D S Pham, Svetha Venkatesh

Learning robust subspaces to maximize class discrimination is challenging, and most existing work treats dimensionality reduction and classifier design as only weakly connected steps. We propose an alternative framework in which the two steps are combined in a joint formulation that exploits the direct connection between dimensionality reduction and classification. Specifically, we learn an optimal subspace on the Grassmann manifold while jointly minimizing the classification error of an SVM classifier. We minimize the regularized empirical risk over both the hypothesis space of functions underlying this new generalized multi-class Lagrangian SVM and the Grassmann manifold, so that a linear projection is found. We propose an iterative algorithm that meets the dual goal of optimizing both the classifier and the projection. Extensive numerical studies on challenging datasets show robust performance of the proposed scheme over alternatives when limited training data is used, verifying the advantage of the joint formulation.
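The paper itself is embargoed here, so the abstract is the only description of the method available. Purely as an illustration of the general idea (not the authors' actual algorithm), a joint subspace/classifier scheme of this kind can be sketched as alternating optimization: train a one-vs-rest linear hinge-loss SVM in the projected space, then take a gradient step on the projection matrix and retract it back onto the Grassmann manifold via QR orthonormalization. All data, step sizes, and function names below are hypothetical choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 3 classes separable in a 2-D subspace of R^10
# (dims 0-1 carry the signal, the remaining dims are noise).
n_per, D, K, k = 60, 10, 3, 2
means = np.zeros((K, D))
means[0, :2] = [4.0, 0.0]
means[1, :2] = [-4.0, 0.0]
means[2, :2] = [0.0, 4.0]
X = np.vstack([m + rng.normal(0.0, 0.5, (n_per, D)) for m in means])
y = np.repeat(np.arange(K), n_per)

def retract(P):
    """Orthonormal representative of span(P) via QR: a standard
    retraction onto the Grassmann manifold."""
    Q, _ = np.linalg.qr(P)
    return Q[:, :P.shape[1]]

def svm_step(Z, y, W, lam=1e-2, lr=0.05):
    """One subgradient step per class of a one-vs-rest linear
    hinge-loss SVM (no bias term, for brevity)."""
    for c in range(W.shape[0]):
        t = np.where(y == c, 1.0, -1.0)
        viol = t * (Z @ W[c]) < 1.0                     # margin violators
        grad = lam * W[c] - (t[viol, None] * Z[viol]).sum(0) / len(Z)
        W[c] = W[c] - lr * grad
    return W

# Alternating optimization: SVM step in the projected space, then a
# Euclidean gradient step on the projection P followed by retraction.
P = retract(rng.normal(size=(D, k)))                    # random start point
W = np.zeros((K, k))
for _ in range(100):
    Z = X @ P
    for _ in range(10):                                 # inner SVM updates
        W = svm_step(Z, y, W)
    G = np.zeros_like(P)                                # d(hinge loss)/dP
    for c in range(K):
        t = np.where(y == c, 1.0, -1.0)
        viol = t * (Z @ W[c]) < 1.0
        s = (t[viol, None] * X[viol]).sum(0)            # sum of t_i * x_i
        G -= np.outer(s, W[c]) / len(X)
    P = retract(P - 0.05 * G)

acc = float((np.argmax((X @ P) @ W.T, axis=1) == y).mean())
```

The QR retraction keeps `P` orthonormal after every gradient step, so the projection stays a valid point on the manifold throughout; the abstract's point is that the SVM loss, not a separate unsupervised criterion, drives where that point moves.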
History
Event: Advances in Artificial Intelligence. Conference (24th : 2011 : Perth, Western Australia)
Source: AI 2011 : advances in artificial intelligence : 24th Australasian Joint Conference, Perth, Australia, December 5-8, 2011 : proceedings
Series: Lecture notes in artificial intelligence ; 7106
Pagination: 241 - 250
Publisher: Springer-Verlag
Location: Perth, Western Australia
Place of publication: Berlin, Germany
Publisher DOI:
Start date: 2011-12-05
End date: 2011-12-08
ISSN: 0302-9743
ISBN-13: 9783642258312
ISBN-10: 364225831X
Language: eng
Publication classification: E1.1 Full written paper - refereed
Copyright notice: 2011, Springer-Verlag Berlin Heidelberg
Extent: 82
Editor/Contributor(s): D Wang, M Reynolds
Title of proceedings: AI 2011 : Advances in Artificial Intelligence : 24th Australasian Joint Conference, Perth, Australia, December 5-8, 2011 : proceedings
Categories: No categories selected
Keywords: classification errors; classifier design; data sets; dimensionality reduction; empirical risks; Grassmann manifold; hypothesis space; iterative algorithm; Lagrangian; limited training data; linear projections; multi-class; numerical studies; optimal subspace; robust performance; subspace learning; SVM classifiers; weak connection; Science & Technology; Technology; Computer Science, Artificial Intelligence; Computer Science; ILLUMINATION; RECOGNITION