Dual space gradient descent for online learning

Version 2 2024-06-05, 11:50
Version 1 2017-05-25, 22:40
conference contribution
posted on 2024-06-05, 11:50 authored by T Le, TD Nguyen, V Nguyen, D Phung
One crucial goal in kernel online learning is to bound the model size. Common approaches employ budget maintenance procedures that restrict the model size using removal, projection, or merging strategies. Although projection and merging are known in the literature to be the most effective strategies, they demand extensive computation, whilst the removal strategy fails to retain the information of the removed vectors. An alternative way to address the model-size problem is to use random features to approximate the kernel function, which allows the model to be maintained directly in the random-feature space and hence effectively resolves the curse of kernelization. However, this approach suffers from a serious shortcoming: it requires a high-dimensional random-feature space to achieve a sufficiently accurate kernel approximation, which leads to a significant increase in computational cost. To address all of these challenges, we present in this paper the Dual Space Gradient Descent (DualSGD), a novel framework that utilizes random features as an auxiliary space to maintain the information of data points removed during budget maintenance. Consequently, our approach permits the budget to be maintained in a simple, direct and elegant way while simultaneously mitigating the impact of the dimensionality issue on learning performance. We further provide a convergence analysis and conduct extensive experiments on five real-world datasets to demonstrate the predictive performance and scalability of our proposed method in comparison with state-of-the-art baselines.
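
The mechanism described in the abstract can be sketched as follows: a budgeted kernel SGD whose evicted support vectors are absorbed into a random-feature space rather than discarded. This is a minimal illustrative sketch under assumed choices (binary classification with hinge loss, an RBF kernel, random Fourier features, oldest-vector removal); the names DualSGD, budget_size, n_rff, gamma and eta are ours for illustration, not the authors' reference implementation.

import numpy as np

class DualSGD:
    """Sketch of dual-space gradient descent: exact kernel expansion on a
    small budget plus a random Fourier feature (RFF) space that absorbs
    the contribution of every support vector evicted from the budget."""

    def __init__(self, dim, budget_size=50, n_rff=200, gamma=1.0, eta=0.1, seed=None):
        rng = np.random.default_rng(seed)
        self.budget_size = budget_size
        self.gamma = gamma          # RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2)
        self.eta = eta              # learning rate
        # RFF map z(x) = sqrt(2/D) * cos(Wx + b) with W ~ N(0, 2*gamma*I),
        # so that z(x) . z(y) approximates k(x, y) in expectation.
        self.W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(n_rff, dim))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=n_rff)
        self.w = np.zeros(n_rff)    # weights in the auxiliary RFF space
        self.sv = []                # budgeted support vectors
        self.alpha = []             # their coefficients

    def _z(self, x):
        return np.sqrt(2.0 / len(self.b)) * np.cos(self.W @ x + self.b)

    def _kernel_part(self, x):
        return sum(a * np.exp(-self.gamma * np.sum((x - v) ** 2))
                   for a, v in zip(self.alpha, self.sv))

    def decision(self, x):
        # The model lives in two spaces: the exact kernel expansion over
        # the budgeted support set plus the linear part in the RFF space.
        return self._kernel_part(x) + self.w @ self._z(x)

    def partial_fit(self, x, y):
        # SGD step on the hinge loss; y is assumed to be +1 or -1.
        if y * self.decision(x) < 1.0:
            self.sv.append(x)
            self.alpha.append(self.eta * y)
            if len(self.sv) > self.budget_size:
                # Budget maintenance by removal, but instead of discarding
                # the oldest vector, fold its approximate contribution
                # a * k(v, .) ~= a * z(v) . z(.) into the RFF weights.
                v, a = self.sv.pop(0), self.alpha.pop(0)
                self.w += a * self._z(v)

Streaming (x_t, y_t) pairs through partial_fit keeps at most budget_size exact support vectors, while the RFF weights accumulate the contributions of every evicted vector, so budget maintenance stays a cheap removal while the removed information is retained in the auxiliary space.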

History

Volume

29

Pagination

1-9

Location

Barcelona, Spain

Start date

2016-12-05

End date

2016-12-10

ISSN

1049-5258

Language

eng

Publication classification

E Conference publication, E1 Full written paper - refereed

Copyright notice

[2016, NIPS]

Editor/Contributor(s)

Lee DD, Sugiyama M, Luxburg UV, Guyon I, Garnett R

Title of proceedings

NIPS 2016 : Advances in neural information processing systems : Proceedings of the 30th Conference on Neural Information Processing Systems

Event

Neural Information Processing Systems Foundation. Conference (30th : 2016 : Barcelona, Spain)

Publisher

Neural Information Processing Systems

Place of publication

[Cambridge, Mass.]

Series

Neural Information Processing Systems Foundation Conference