Deakin University

File(s) under permanent embargo

L-FGADMM: Layer-Wise Federated Group ADMM for Communication Efficient Decentralized Deep Learning

conference contribution
posted on 2020-01-01 authored by A Elgabli, Jihong Park, S Ahmed, M Bennis
© 2020 IEEE. This article proposes a communication-efficient decentralized deep learning algorithm, coined layer-wise federated group ADMM (L-FGADMM). To minimize an empirical risk, every worker in L-FGADMM periodically communicates with its two neighbors, and the communication period is adjusted separately for each layer of its deep neural network. A constrained optimization problem for this setting is formulated and solved using the stochastic version of GADMM proposed in our prior work. Numerical evaluations show that by exchanging the largest layer less frequently, L-FGADMM significantly reduces the communication cost without compromising the convergence speed. Surprisingly, despite exchanging less information and operating in a decentralized way, intermittently skipping consensus on the largest layer has a regularizing effect, allowing L-FGADMM to achieve test accuracy as high as that of federated learning (FL), a baseline that enforces consensus over all layers with the aid of a central entity.
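
To make the layer-wise communication rule concrete, the Python sketch below illustrates only the core idea and is not the paper's algorithm: it keeps the per-layer exchange periods but replaces GADMM's primal-dual updates over head and tail worker groups with plain averaging between two ring neighbors. All names and values (NUM_WORKERS, LAYER_PERIODS, the layer shapes) are illustrative assumptions.

# Minimal sketch of L-FGADMM's layer-wise periodic exchange (hypothetical
# names throughout). GADMM's primal-dual head/tail updates are replaced by
# plain averaging with the two ring neighbors to keep the example short.
import numpy as np

NUM_WORKERS = 6
# Assumed per-layer exchange periods: the large layer ("fc") is shared
# every 4 rounds, the small output layer ("head") every round.
LAYER_PERIODS = {"fc": 4, "head": 1}
LAYER_SHAPES = {"fc": (256, 256), "head": (256, 10)}

rng = np.random.default_rng(0)
# Every worker keeps its own copy of all layers.
workers = [
    {name: rng.normal(size=shape) for name, shape in LAYER_SHAPES.items()}
    for _ in range(NUM_WORKERS)
]

def local_update(params):
    # Stand-in for one local stochastic gradient step.
    for w in params.values():
        w -= 0.01 * rng.normal(size=w.shape)

def neighbor_exchange(round_idx):
    # Average each layer with the two ring neighbors, but only on rounds
    # that are multiples of that layer's exchange period.
    new_workers = []
    for i, params in enumerate(workers):
        left = workers[(i - 1) % NUM_WORKERS]
        right = workers[(i + 1) % NUM_WORKERS]
        updated = {}
        for name, w in params.items():
            if round_idx % LAYER_PERIODS[name] == 0:
                updated[name] = (w + left[name] + right[name]) / 3.0
            else:
                updated[name] = w.copy()  # skip this layer this round
        new_workers.append(updated)
    workers[:] = new_workers  # synchronous update across all workers

for t in range(8):
    for params in workers:
        local_update(params)
    neighbor_exchange(t)
    shared = [n for n, p in LAYER_PERIODS.items() if t % p == 0]
    print(f"round {t}: exchanged layers {shared}")

Under these assumed periods, the large 256x256 layer is transmitted only every fourth round, which is the source of the communication savings the abstract describes.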

History

Volume

2020-May

Pagination

1-6

Location

Seoul, South Korea

Start date

2020-05-25

End date

2020-05-28

ISSN

1525-3511

ISBN-13

9781728131061

Language

eng

Publication classification

E1.1 Full written paper - refereed

Title of proceedings

WCNC 2020 : Proceedings of the 2020 IEEE Wireless Communications and Networking Conference

Event

Wireless Communications and Networking Conference (2020 : Seoul, South Korea)

Publisher

IEEE

Place of publication

Piscataway, N.J.
