
File(s) under permanent embargo

Nonparametric budgeted stochastic gradient descent

Version 2 2024-06-05, 11:50
Version 1 2019-07-02, 13:45
conference contribution
Posted on 2024-06-05, 11:50. Authored by T Le, V Nguyen, TD Nguyen, D Phung
One of the most challenging problems in kernel online learning is bounding the model size. Budgeted kernel online learning addresses this issue by capping the model size at a predefined budget; however, determining an appropriate value for such a budget is difficult. In this paper, we propose Nonparametric Budgeted Stochastic Gradient Descent, which allows the model size to grow automatically with the data in a principled way. We provide a theoretical analysis showing that our framework is guaranteed to converge for a large collection of loss functions (e.g., Hinge, Logistic, L2, L1, and ε-insensitive), which enables the proposed algorithm to perform both classification and regression without hurting the ideal convergence rate O(1/T) of standard Stochastic Gradient Descent. We validate our algorithm on real-world datasets to consolidate the theoretical claims.
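
The record above does not reproduce the paper's algorithmic details. As a rough illustration only, the minimal Python sketch below shows a generic budgeted kernel online SGD with hinge loss, where a new support vector is admitted only when the incoming point is poorly covered by the current support set. The admission rule, the coverage threshold tau, and all function names are illustrative assumptions, not the authors' principled growth criterion.

import numpy as np

def rbf(x, z, gamma=1.0):
    # Gaussian RBF kernel between two vectors.
    return np.exp(-gamma * np.sum((x - z) ** 2))

def kernel_online_sgd(stream, lam=0.01, gamma=1.0, tau=0.5):
    """Online kernel SGD with hinge loss on a stream of (x, y) pairs, y in {-1, +1}.

    The model is f(x) = sum_i alpha_i * k(x_i, x) over a growing support set.
    A new point joins the support set only if its maximum kernel similarity
    to the current support set is below tau (illustrative rule).
    """
    support, alpha = [], []
    for t, (x, y) in enumerate(stream, start=1):
        eta = 1.0 / (lam * t)                      # standard SGD step size
        f_x = sum(a * rbf(s, x, gamma) for s, a in zip(support, alpha))
        # Shrink existing coefficients (gradient of the L2 regularizer).
        alpha = [(1.0 - eta * lam) * a for a in alpha]
        if y * f_x < 1.0:                          # hinge loss is active
            sims = [rbf(s, x, gamma) for s in support]
            if not support or max(sims) < tau:
                support.append(x)                  # grow the model
                alpha.append(eta * y)
            else:
                # Otherwise, push the update onto the closest existing vector.
                alpha[int(np.argmax(sims))] += eta * y
    return support, alpha

# Tiny usage example on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + X[:, 1])
support, alpha = kernel_online_sgd(zip(X, y))
print(len(support), "support vectors retained")

The 1/(lam*t) step size mirrors the standard SGD schedule referenced in the abstract; the growth test is only a stand-in for the paper's actual construction.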

History

Volume

51

Pagination

564-572

Location

Cadiz, Spain

Start date

2016-05-09

End date

2016-05-11

ISSN

2640-3498

Language

eng

Publication classification

E1 Full written paper - refereed

Copyright notice

2016, the authors

Editor/Contributor(s)

Gretton A, Robert CC

Title of proceedings

AISTATS 2016 : Proceedings of the 19th International Conference on Artificial Intelligence and Statistics

Event

Artificial Intelligence and Statistics. Conference (19th : 2016 : Cadiz, Spain)

Publisher

JMLR

Place of publication

Cambridge, Mass.

Series

Artificial Intelligence and Statistics Conference
