Deakin University


Robust federated learning under statistical heterogeneity via Hessian-weighted aggregation

journal contribution
posted on 2023-02-20, 02:07 authored by A Ahmad, Wei LuoWei Luo, Antonio Robles-KellyAntonio Robles-Kelly
In federated learning, client models are often trained on local training sets that vary in size and distribution. Such statistical heterogeneity in the training data leads to performance variations across local models. Even within a single model, some parameter estimates can be more reliable than others. Most existing federated learning (FL) approaches, such as FedAvg, do not explicitly address these variations in client parameter estimates and treat all local parameters with equal importance during model aggregation. This disregard of the varying evidential credence among client models often leads to slow convergence and a sensitive global model. We address this gap by proposing an aggregation mechanism based on the Hessian matrix. Further, by making use of first-order information from the loss function, we can use the Hessian as a scaling matrix in a manner akin to that employed in quasi-Newton methods. This treatment captures the impact of data-quality variations across local models. Experiments show that our method outperforms the baselines Federated Averaging (FedAvg), FedProx, Federated Curvature (FedCurv), and Federated Newton Learn (FedNL) for image classification on the MNIST, Fashion-MNIST, and CIFAR-10 datasets when the client models are trained on statistically heterogeneous data.
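To make the aggregation idea concrete, the following is a minimal sketch (not the authors' implementation) of a Hessian-weighted server-side average. It assumes each client supplies a diagonal Hessian estimate alongside its parameters, so that coordinates with higher local curvature (stronger evidence) receive larger weight; the function name `hessian_weighted_aggregate` and the diagonal approximation are illustrative assumptions.

```python
import numpy as np

def hessian_weighted_aggregate(client_params, client_hessians, eps=1e-8):
    """Aggregate client parameter vectors coordinate-wise, weighting each
    coordinate by the client's (diagonal) Hessian estimate. Coordinates
    with larger curvature are treated as more reliable local estimates.

    client_params:   list of 1-D arrays, one per client (same shape)
    client_hessians: list of 1-D arrays of nonnegative curvature estimates
    """
    params = np.stack(client_params)     # shape: (n_clients, n_params)
    weights = np.stack(client_hessians)  # same shape, nonnegative entries
    totals = weights.sum(axis=0) + eps   # per-coordinate normalizer
    return (weights * params).sum(axis=0) / totals
```

When every client reports identical Hessian estimates, this reduces to the plain unweighted mean, i.e. FedAvg-style aggregation; heterogeneous curvature estimates shift each coordinate of the global model toward the clients with stronger evidence for it.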

History

Journal

Machine Learning

Volume

112

Location

Berlin, Germany

ISSN

0885-6125

eISSN

1573-0565

Language

English

Publication classification

C1 Refereed article in a scholarly journal

Issue

2

Publisher

Springer