

Q-GADMM: Quantized group ADMM for communication efficient decentralized machine learning

conference contribution
posted on 2024-06-05, 07:15 authored by A Elgabli, Jihong Park, AS Bedi, M Bennis, V Aggarwal
In this paper, we propose a communication-efficient decentralized machine learning (ML) algorithm, coined quantized group ADMM (Q-GADMM). Every worker in Q-GADMM communicates only with two neighbors and updates its model via the group alternating direction method of multipliers (GADMM), thereby ensuring fast convergence while reducing the number of communication rounds. Furthermore, each worker quantizes its model updates before transmission, thereby decreasing the communication payload size. We prove that Q-GADMM converges to the optimal solution for convex loss functions, and numerically show that Q-GADMM yields 7x lower communication cost while achieving almost the same accuracy and convergence speed as GADMM without quantization.
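
The payload reduction comes from transmitting quantized model updates rather than full-precision models. As a rough illustration only, and not the paper's exact quantizer, the Python sketch below shows an unbiased stochastic uniform quantizer applied to the difference between a worker's new model and the last quantized model it transmitted; all function and variable names here are hypothetical.

```python
import numpy as np

def quantize_update(theta_new, theta_prev_q, bits=8, rng=None):
    """Stochastically quantize the update (theta_new - theta_prev_q) onto a
    uniform grid with 2**bits levels. Hypothetical sketch, not the paper's
    exact quantizer: a worker would send only the integer grid indices and
    the scalar range to its two neighbors."""
    rng = np.random.default_rng() if rng is None else rng
    delta = theta_new - theta_prev_q            # model update to transmit
    r = float(np.max(np.abs(delta)))            # dynamic range of this update
    if r == 0.0:                                # nothing changed this round
        return theta_prev_q.copy(), np.zeros(delta.shape, dtype=int), r
    levels = 2 ** bits - 1
    step = 2.0 * r / levels                     # width of one quantization bin
    scaled = (delta + r) / step                 # map [-r, r] onto [0, levels]
    low = np.floor(scaled)
    # Randomized rounding keeps the quantizer unbiased in expectation.
    idx = (low + (rng.random(delta.shape) < (scaled - low))).astype(int)
    theta_q = theta_prev_q + (idx * step - r)   # receiver-side reconstruction
    return theta_q, idx, r

# Example: with bits=8, each coordinate costs ~8 bits instead of 32 or 64.
rng = np.random.default_rng(0)
theta_prev_q = np.zeros(5)
theta_new = rng.normal(size=5)
theta_q, idx, r = quantize_update(theta_new, theta_prev_q, bits=8, rng=rng)
print(np.max(np.abs(theta_q - theta_new)))     # error at most one bin, 2r/255
```

Under this sketch, a neighbor receiving `idx` and `r` can rebuild the same quantized model `theta_q` from its own copy of `theta_prev_q`, which is what lets successive rounds exchange only low-bit indices.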

History

Volume

2020-May

Pagination

8876-8880

Location

Online: Barcelona, Spain

Start date

2020-05-04

End date

2020-05-08

ISSN

1520-6149

ISBN-13

9781509066315

Language

eng

Publication classification

E1.1 Full written paper - refereed

Copyright notice

2020, IEEE

Title of proceedings

ICASSP 2020: Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing

Event

ICASSP 2020: IEEE International Conference on Acoustics, Speech and Signal Processing (Barcelona, Spain)

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Place of publication

Piscataway, N.J.
