
Communication Efficient Framework for Decentralized Machine Learning

conference contribution
posted on 2024-06-05, 07:15, authored by A Elgabli, Jihong Park, AS Bedi, M Bennis, V Aggarwal
© 2020 IEEE. In this paper, we propose a fast, privacy-aware, and communication-efficient decentralized framework to solve the distributed machine learning (DML) problem. The proposed algorithm, GADMM, is based on the Alternating Direction Method of Multipliers (ADMM). Its key novelty is that it solves the problem over a decentralized topology in which at most half of the workers compete for the limited communication resources at any given time. Moreover, each worker exchanges its locally trained model only with its two neighboring workers, thereby training a global model with lower communication overhead per exchange. We prove that GADMM converges faster than centralized batch gradient descent for convex loss functions, and we show numerically that it converges faster and is more communication-efficient than state-of-the-art communication-efficient algorithms such as Lazily Aggregated Gradient (LAG) and dual averaging, on linear and logistic regression tasks with synthetic and real datasets.
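The head/tail update pattern described in the abstract can be illustrated with a minimal sketch for decentralized least squares on a chain of workers. The synthetic data, the closed-form local solver, and the penalty parameter rho below are illustrative assumptions, not the authors' reference implementation; the sketch only mirrors the two properties stated above: each worker exchanges its model with at most two neighbors, and only half of the workers update (and transmit) at a time.

# Minimal GADMM-style sketch (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(0)

N, d, m = 6, 5, 20          # workers, model dimension, samples per worker
rho = 1.0                   # ADMM penalty parameter (assumed value)
iters = 200

# Synthetic local datasets: worker n holds (A[n], b[n]) with loss 0.5*||A[n] x - b[n]||^2.
x_true = rng.standard_normal(d)
A = [rng.standard_normal((m, d)) for _ in range(N)]
b = [A[n] @ x_true + 0.1 * rng.standard_normal(m) for n in range(N)]

theta = [np.zeros(d) for _ in range(N)]      # local models
lam = [np.zeros(d) for _ in range(N - 1)]    # dual variable per chain edge (n, n+1)

def local_update(n):
    # Closed-form primal update for worker n, using only the models of its
    # (at most two) neighbors on the chain.
    rhs = A[n].T @ b[n]
    n_nbrs = 0
    if n > 0:                      # left neighbor exists
        rhs += lam[n - 1] + rho * theta[n - 1]
        n_nbrs += 1
    if n < N - 1:                  # right neighbor exists
        rhs += -lam[n] + rho * theta[n + 1]
        n_nbrs += 1
    H = A[n].T @ A[n] + n_nbrs * rho * np.eye(d)
    theta[n] = np.linalg.solve(H, rhs)

for k in range(iters):
    # Head workers (even indices) update in parallel; only they transmit now,
    # so at most half of the workers use the channel at any given time.
    for n in range(0, N, 2):
        local_update(n)
    # Tail workers (odd indices) update next, using the heads' fresh models.
    for n in range(1, N, 2):
        local_update(n)
    # Dual update on every edge of the chain.
    for n in range(N - 1):
        lam[n] = lam[n] + rho * (theta[n] - theta[n + 1])

print("consensus error:", max(np.linalg.norm(theta[n] - theta[0]) for n in range(1, N)))

The consensus error printed at the end should shrink toward zero as the neighboring models agree, which is the mechanism by which a global model is trained without any central aggregator.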

History

Pagination

1-5

Location

Princeton, NJ, USA

Start date

2020-03-18

End date

2020-03-20

ISBN-13

9781728140858

Language

eng

Notes

Conference cancelled due to COVID-19

Publication classification

E1.1 Full written paper - refereed

Title of proceedings

CISS 2020 : Proceedings of the Information Sciences and Systems 2020 conference

Event

IEEE Information Theory Society. Conference (54th. 2020 : Princeton, N.J.)

Publisher

Institute of Electrical and Electronics Engineers

Place of publication

Piscataway, N.J.
