File(s) under permanent embargo

Mutual Information Driven Federated Learning

journal contribution
posted on 2021-07-01, 00:00 authored by Md Palash Uddin, Yong Xiang, Xuequan Lu, John Yearwood, Longxiang Gao
Federated Learning (FL) is an emerging research field that yields a globally trained model from different local clients without violating data privacy. Existing FL techniques often ignore the effective distinction between local models and the aggregated global model when performing the client-side weight update, as well as the distinctions among local models during server-side aggregation. In this paper, we propose a novel FL approach that resorts to mutual information (MI). Specifically, on the client side, the weight update is reformulated by minimizing the MI between the local and aggregated models and employing a Negative Correlation Learning (NCL) strategy. On the server side, we select the most effective models for aggregation based on the MI between each individual local model and its previous aggregated model. We also theoretically prove the convergence of our algorithm. Experiments conducted on the MNIST, CIFAR-10, and clinical MIMIC-III datasets show that our method outperforms state-of-the-art techniques in both communication and testing performance.
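The server-side selection step described above can be sketched roughly as follows. This is a minimal illustration only: the histogram-based MI estimator, the assumption that higher MI with the previous global model marks a more effective client, and all function names (`mutual_information`, `select_top_clients`) are our own assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based MI estimate (in nats) between two flattened weight vectors.

    A simple plug-in estimator: discretize both vectors into a joint
    2-D histogram and sum p(x,y) * log(p(x,y) / (p(x) * p(y))).
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal over y
    py = pxy.sum(axis=0, keepdims=True)   # marginal over x
    nz = pxy > 0                          # avoid log(0) terms
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def select_top_clients(local_weights, prev_global, k):
    """Rank local models by MI with the previous aggregated (global) model.

    Keeps the k clients whose weights share the most information with the
    previous global model (ranking direction is an assumption here).
    """
    scores = [mutual_information(w, prev_global) for w in local_weights]
    order = np.argsort(scores)[::-1]      # higher MI first
    return order[:k].tolist(), scores

# Toy usage: 3 simulated clients; client 0 is a lightly perturbed copy
# of the previous global weights, the others are unrelated noise.
rng = np.random.default_rng(0)
prev_global = rng.normal(size=5000)
clients = [
    prev_global + 0.01 * rng.normal(size=5000),  # close to global
    rng.normal(size=5000),                       # unrelated
    rng.normal(size=5000),                       # unrelated
]
selected, scores = select_top_clients(clients, prev_global, k=1)
```

In this toy setup the perturbed copy (client 0) scores far higher MI with the previous global model than the unrelated clients, so it is the one selected for aggregation.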

History

Journal

IEEE Transactions on Parallel and Distributed Systems

Volume

32

Issue

7

Pagination

1526 - 1538

Publisher

IEEE

Location

Piscataway, N.J.

ISSN

1045-9219

eISSN

2161-9883

Language

eng

Publication classification

C1 Refereed article in a scholarly journal