File(s) under permanent embargo
L-FGADMM: Layer-Wise Federated Group ADMM for Communication Efficient Decentralized Deep Learning
conference contribution
posted on 2020-01-01, 00:00 authored by A. Elgabli, Jihong Park, S. Ahmed, M. Bennis

© 2020 IEEE. This article proposes a communication-efficient decentralized deep learning algorithm, coined layer-wise federated group ADMM (L-FGADMM). To minimize an empirical risk, every worker in L-FGADMM periodically communicates with two neighbors, with the communication period adjusted separately for each layer of its deep neural network. A constrained optimization problem for this setting is formulated and solved using the stochastic version of GADMM proposed in our prior work. Numerical evaluations show that by exchanging the largest layer less frequently, L-FGADMM can significantly reduce the communication cost without compromising the convergence speed. Surprisingly, despite less exchanged information and decentralized operations, intermittently skipping the consensus step on the largest layer creates a regularizing effect, allowing L-FGADMM to achieve test accuracy as high as that of federated learning (FL), a baseline that enforces consensus on all layers with the aid of a central entity.
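The core mechanism the abstract describes, per-layer communication periods over a decentralized topology, can be illustrated with a short sketch. The Python snippet below is not the authors' implementation: it substitutes plain neighbor averaging for the stochastic GADMM primal-dual updates, and the ring topology, layer shapes, periods, and step size are all illustrative assumptions chosen only so the snippet is self-contained and runnable.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_WORKERS = 4                            # workers on a ring (assumption)
LAYER_SHAPES = [(8, 8), (8, 8), (64, 64)]  # last layer is the "largest"
LAYER_PERIODS = [1, 1, 4]                  # largest layer: every 4th iteration
STEP_SIZE = 0.1
NUM_ITERS = 20

# Each worker holds its own copy of every layer.
models = [[rng.normal(size=s) for s in LAYER_SHAPES]
          for _ in range(NUM_WORKERS)]

def local_gradient(layer):
    """Stand-in for a stochastic gradient of the local empirical risk."""
    return layer + 0.01 * rng.normal(size=layer.shape)

for t in range(NUM_ITERS):
    # 1) Local stochastic gradient step on every layer of every worker.
    for w in range(NUM_WORKERS):
        models[w] = [layer - STEP_SIZE * local_gradient(layer)
                     for layer in models[w]]

    # 2) Layer-wise exchange: a layer is averaged with the two ring
    #    neighbors only when its communication period is due, so the
    #    largest layer is transmitted far less often than the others.
    averaged = [[layer.copy() for layer in m] for m in models]
    for l, period in enumerate(LAYER_PERIODS):
        if t % period != 0:
            continue  # skip this layer's exchange this round
        for w in range(NUM_WORKERS):
            left, right = (w - 1) % NUM_WORKERS, (w + 1) % NUM_WORKERS
            averaged[w][l] = (models[left][l] + models[w][l]
                              + models[right][l]) / 3.0
    models = averaged

print("mean |largest layer| after training:",
      np.mean([np.abs(m[-1]).mean() for m in models]))
```

In this toy setting the per-round traffic scales with the sum over layers of size divided by period, so raising the period of the 64×64 layer removes most of the communication cost; the paper's finding is that, with the actual stochastic GADMM updates, this intermittent skipping also acts as a regularizer rather than degrading accuracy.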
History
Volume: 2020-May
Pagination: 1-6
Location: Seoul, South Korea
Publisher DOI:
Start date: 2020-05-25
End date: 2020-05-28
ISSN: 1525-3511
ISBN-13: 9781728131061
Language: eng
Publication classification: E1.1 Full written paper - refereed
Title of proceedings: WCNC 2020 : Proceedings of the 2020 IEEE Wireless Communications and Networking Conference
Event: Wireless Communications and Networking Conference (2020 : Seoul, South Korea)
Publisher: IEEE
Place of publication: Piscataway, N.J.