
Robust Federated Averaging via Outlier Pruning

journal contribution
posted on 2024-05-30, 16:19 authored by MP Uddin, Yong Xiang, John Yearwood, Longxiang Gao
Federated Averaging (FedAvg) is the baseline Federated Learning (FL) algorithm: it applies stochastic gradient descent for local model training and arithmetic averaging of the local models' parameters for global model aggregation. Subsequent FL works commonly adopt FedAvg's arithmetic averaging scheme for aggregation. However, arithmetic averaging is vulnerable to outlier model updates, especially when the clients' data are non-Independent and Identically Distributed (non-IID). As a result, the classical aggregation approach suffers from the dominance of outlier updates and, consequently, incurs high communication costs before producing a decent global model. In this letter, we propose a robust aggregation strategy to alleviate these issues. In particular, we first prune the node-wise outlier updates (weights) from the locally trained models and then perform the aggregation over the selected effective weight set at each node. We provide a theoretical analysis of our method and conduct extensive experiments on the MNIST, CIFAR-10, and Shakespeare datasets under both IID and non-IID settings, which demonstrate that our aggregation approach outperforms state-of-the-art methods in terms of communication speedup, test-set performance, and training convergence.
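A minimal sketch of the aggregation idea described above, assuming a coordinate-wise z-score rule: for each parameter, collect its values across clients, drop values flagged as outliers, and average only the retained ("effective") values. The function name pruned_fedavg, the z-score criterion, and the z_thresh parameter are illustrative assumptions for this sketch, not the paper's exact pruning rule.

```python
import numpy as np

def pruned_fedavg(client_weights, z_thresh=2.0):
    """Aggregate flattened client models, pruning per-coordinate outliers.

    client_weights: list of 1-D numpy arrays (flattened local model
    parameters), one per client. The z-score rule and z_thresh are
    illustrative choices, not the paper's exact criterion.
    """
    W = np.stack(client_weights)        # shape: (num_clients, num_params)
    mean = W.mean(axis=0)
    std = W.std(axis=0) + 1e-12         # avoid division by zero
    z = np.abs(W - mean) / std          # per-coordinate z-scores
    keep = z <= z_thresh                # mask of "effective" weights
    # Guard: if every client is pruned at some coordinate, keep them all.
    keep[:, keep.sum(axis=0) == 0] = True
    # Average only the retained (non-outlier) values at each coordinate.
    return (W * keep).sum(axis=0) / keep.sum(axis=0)

if __name__ == "__main__":
    # Toy usage: five clients, one of which sends an outlier update.
    clients = [np.random.randn(10) for _ in range(5)]
    clients[0] += 10.0                  # simulate an outlier client
    global_weights = pruned_fedavg(clients)
    print(global_weights)
```

Compared with plain FedAvg (a simple W.mean(axis=0)), the pruning step bounds the influence any single outlier update can exert on each coordinate of the global model.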

History

Journal

IEEE Signal Processing Letters

Volume

29

Pagination

409-413

Location

Piscataway, NJ

ISSN

1070-9908

eISSN

1558-2361

Language

English

Publication classification

C1 Refereed article in a scholarly journal

Publisher

IEEE (Institute of Electrical and Electronics Engineers)