Sparse adaptive multi-hyperplane machine

Nguyen, Khanh, Le, Trung, Nguyen, Vu and Phung, Dinh 2016, Sparse adaptive multi-hyperplane machine. In Bailey, James, Khan, Latifur, Washio, Takashi, Dobbie, Gillian, Huang, Joshua Zhexue and Wang, Ruili (eds), Advances in knowledge discovery and data mining: 20th Pacific-Asia Conference, PAKDD 2016, Auckland, New Zealand, April 19–22, 2016, Proceedings, Part I, Springer, New York, N.Y., pp. 27-39, doi: 10.1007/978-3-319-31753-3_3.

Title Sparse adaptive multi-hyperplane machine
Author(s) Nguyen, Khanh
Le, Trung
Nguyen, Vu
Phung, Dinh (ORCID: orcid.org/0000-0002-9977-8247)
Title of book Advances in knowledge discovery and data mining: 20th Pacific-Asia Conference, PAKDD 2016, Auckland, New Zealand, April 19–22, 2016, Proceedings, Part I
Editor(s) Bailey, James
Khan, Latifur
Washio, Takashi
Dobbie, Gillian
Huang, Joshua Zhexue
Wang, Ruili
Publication date 2016
Series Lecture Notes in Computer Science, v.9651
Chapter number 3
Total chapters 47
Start page 27
End page 39
Total pages 13
Publisher Springer
Place of Publication New York, N.Y.
Summary The Adaptive Multi-hyperplane Machine (AMM) was recently proposed to deal with large-scale datasets. However, it lacks a principled way to tune the complexity and sparsity of its solution. Addressing sparsity is important for improving generalization, prediction accuracy and computational speed. In this paper, we employ the max-margin principle and a sparsity-promoting approach to propose a new Sparse AMM (SAMM). We solve the resulting optimization objective with stochastic gradient descent (SGD). Besides inheriting the desirable properties of SGD-based learning and the original AMM, the proposed SAMM provides the machinery and flexibility to tune the complexity and sparsity of its solution, making it possible to avoid both overfitting and underfitting. We validate our approach on several large benchmark datasets and show that, with the ability to control sparsity, SAMM yields superior classification accuracy to the original AMM while simultaneously achieving a computational speedup. (An illustrative SGD training sketch for a multi-hyperplane model is given after this record.)
ISBN 9783319317533
ISSN 0302-9743
Language eng
DOI 10.1007/978-3-319-31753-3_3
Field of Research 080109 Pattern Recognition and Data Mining
Socio Economic Objective 970108 Expanding Knowledge in the Information and Computing Sciences
HERDC Research category B1 Book chapter
ERA Research output type B Book chapter
Copyright notice ©2016, Springer
Persistent URL http://hdl.handle.net/10536/DRO/DU:30083509

Document type: Book Chapter
Collection: Centre for Pattern Recognition and Data Analytics
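
Illustrative sketch (referenced from the Summary above): a minimal, self-contained example of SGD training for a multi-hyperplane classifier in the spirit of AMM/SAMM. This is not the authors' implementation; the class name, the parameters (n_hyperplanes, lam, sparsity) and in particular the soft-thresholding step used here to encourage sparse weights are assumptions standing in for the sparsity mechanism described in the chapter.

# Illustrative sketch only, NOT the SAMM implementation from the chapter.
# Each class keeps several weight vectors (hyperplanes); the class score is the
# max over them, as in AMM. The soft-thresholding step is a hypothetical
# stand-in for the paper's sparsity control.
import numpy as np

class ToyMultiHyperplaneMachine:
    def __init__(self, n_classes, n_features, n_hyperplanes=3,
                 lam=1e-2, sparsity=1e-3, seed=0):
        rng = np.random.default_rng(seed)
        # W[c] holds the hyperplanes (weight vectors) for class c.
        self.W = 0.01 * rng.standard_normal((n_classes, n_hyperplanes, n_features))
        self.lam = lam            # L2 regularization strength
        self.sparsity = sparsity  # soft-threshold level (illustrative only)

    def _scores(self, x):
        # Class score = max over that class's hyperplanes.
        return (self.W @ x).max(axis=1)

    def predict(self, X):
        return np.array([int(np.argmax(self._scores(x))) for x in X])

    def partial_fit(self, x, y, t):
        eta = 1.0 / (self.lam * (t + 1))       # Pegasos-style decaying step size
        proj = self.W @ x                      # shape: (n_classes, n_hyperplanes)
        j_true = int(np.argmax(proj[y]))       # active hyperplane of the true class
        scores = proj.max(axis=1)
        scores_other = scores.copy()
        scores_other[y] = -np.inf
        y_hat = int(np.argmax(scores_other))   # most-violating wrong class
        # L2 shrinkage from the regularizer.
        self.W *= (1.0 - eta * self.lam)
        # Multiclass hinge update when the margin is violated.
        if 1.0 + scores_other[y_hat] - proj[y, j_true] > 0.0:
            j_wrong = int(np.argmax(proj[y_hat]))
            self.W[y, j_true] += eta * x
            self.W[y_hat, j_wrong] -= eta * x
        # Hypothetical sparsity step: soft-threshold small weights toward zero.
        self.W = np.sign(self.W) * np.maximum(np.abs(self.W) - eta * self.sparsity, 0.0)

# Usage on synthetic data (for illustration only).
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 20))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
model = ToyMultiHyperplaneMachine(n_classes=2, n_features=20)
for t, (xi, yi) in enumerate(zip(X, y)):
    model.partial_fit(xi, yi, t)
print("training accuracy:", (model.predict(X) == y).mean())

The max-over-hyperplanes score keeps the per-class decision piecewise linear while allowing non-linear class boundaries overall, which is the property the chapter's sparsity control is meant to regularize.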