Communication Efficient Framework for Decentralized Machine Learning
Version 2 2024-06-05, 07:15
Version 1 2020-07-06, 16:03
conference contribution
posted on 2024-06-05, 07:15, authored by A Elgabli, Jihong Park, AS Bedi, M Bennis, V Aggarwal

© 2020 IEEE. In this paper, we propose a fast, privacy-aware, and communication-efficient decentralized framework to solve the distributed machine learning (DML) problem. The proposed algorithm, GADMM, is based on the Alternating Direction Method of Multipliers (ADMM). The key novelty is that it solves the problem over a decentralized topology in which at most half of the workers compete for the limited communication resources at any given time. Moreover, each worker exchanges its locally trained model only with its two neighboring workers, thereby training a global model with a lower communication overhead per exchange. We prove that GADMM converges faster than centralized batch gradient descent for convex loss functions, and numerically show that it converges faster and is more communication-efficient than state-of-the-art communication-efficient algorithms such as the Lazily Aggregated Gradient (LAG) and dual averaging, in linear and logistic regression tasks on synthetic and real datasets.
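The abstract describes an alternating update pattern in which one half of the workers (heads) update and transmit while the other half (tails) wait, with each worker coupled only to its two chain neighbours. Below is a minimal Python sketch of that head/tail alternation, assuming least-squares local losses, a chain topology, and closed-form local updates; the penalty rho, variable names, and synthetic data are illustrative assumptions and are not taken from the paper itself.

```python
# Minimal sketch of a GADMM-style head/tail update over a chain of workers.
# Assumptions: local least-squares losses, closed-form local minimization,
# illustrative hyperparameters; not the paper's exact algorithm or settings.
import numpy as np

rng = np.random.default_rng(0)
N, d, rho = 6, 3, 1.0                                  # workers, model dim, penalty
X = [rng.normal(size=(20, d)) for _ in range(N)]       # local features
w_true = rng.normal(size=d)
y = [Xi @ w_true + 0.1 * rng.normal(size=20) for Xi in X]  # local targets

theta = [np.zeros(d) for _ in range(N)]                # local models
lam = [np.zeros(d) for _ in range(N - 1)]              # one dual per chain edge (n, n+1)

def local_update(n):
    """Closed-form minimizer of worker n's least-squares loss plus the
    quadratic coupling to its (at most two) chain neighbours."""
    A = X[n].T @ X[n]
    b = X[n].T @ y[n]
    if n > 0:                                          # coupling to left neighbour
        A += rho * np.eye(d)
        b += lam[n - 1] + rho * theta[n - 1]
    if n < N - 1:                                      # coupling to right neighbour
        A += rho * np.eye(d)
        b += -lam[n] + rho * theta[n + 1]
    return np.linalg.solve(A, b)

for k in range(50):
    for n in range(0, N, 2):                           # head workers update in parallel
        theta[n] = local_update(n)
    for n in range(1, N, 2):                           # tail workers update next
        theta[n] = local_update(n)
    for n in range(N - 1):                             # dual update on every edge
        lam[n] += rho * (theta[n] - theta[n + 1])

print(np.linalg.norm(np.mean(theta, axis=0) - w_true))  # consensus error, should shrink
```

In each iteration only one group transmits its fresh model, and each transmission goes to at most two neighbours, which is the communication-saving structure the abstract highlights.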
History
Pagination: 1-5
Location: Princeton, NJ, USA
Start date: 2020-03-18
End date: 2020-03-20
ISBN-13: 9781728140858
Language: eng
Notes: Conference cancelled due to COVID-19
Publication classification: E1.1 Full written paper - refereed
Title of proceedings: CISS 2020 : Proceedings of the Information Sciences and Systems 2020 conference
Event: IEEE Information Theory Society. Conference (54th. 2020 : Princeton, N.J.)
Publisher: Institute of Electrical and Electronics Engineers
Place of publication: Piscataway, N.J.