Mutual Information Driven Federated Learning
journal contribution
posted on 2021-07-01, 00:00, authored by Md Palash Uddin, Yong Xiang, Xuequan Lu, John Yearwood, Longxiang Gao

Federated Learning (FL) is an emerging research field that yields a globally trained model from different local clients without violating data privacy. Existing FL techniques often ignore the effective distinction between local models and the aggregated global model when performing the client-side weight update, as well as the distinction among local models during the server-side aggregation. In this paper, we propose a novel FL approach that resorts to mutual information (MI). Specifically, on the client side, the weight update is reformulated by minimizing the MI between the local and aggregated models and employing a Negative Correlation Learning (NCL) strategy. On the server side, we select the top effective models for aggregation based on the MI between each individual local model and its previous aggregated model. We also theoretically prove the convergence of our algorithm. Experiments conducted on MNIST, CIFAR-10, and the clinical MIMIC-III datasets show that our method outperforms state-of-the-art techniques in terms of both communication and testing performance.
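The server-side step described above (ranking local models by their MI with the previous aggregated model and keeping only the top ones) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes flattened weight vectors, uses a simple histogram plug-in MI estimator, and plain FedAvg over the selected models; the function names and the top-k selection rule are illustrative assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based plug-in MI estimate between two flattened weight
    vectors. Illustrative only; the paper's exact estimator may differ."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # joint distribution
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y
    nz = pxy > 0                              # avoid log(0)
    # MI = KL(p(x,y) || p(x)p(y)), always non-negative
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def select_top_clients(local_models, prev_global, k):
    """Rank local models by MI with the previous aggregated model
    and keep the k highest-scoring ones (hypothetical selection rule)."""
    scores = [mutual_information(w, prev_global) for w in local_models]
    top = np.argsort(scores)[::-1][:k]
    return [local_models[i] for i in top], top

def aggregate(selected_models):
    """Plain FedAvg-style mean over the selected local models."""
    return np.mean(selected_models, axis=0)
```

For example, with five clients' flattened weight vectors and `k=3`, `select_top_clients` returns the three models most informative about the previous global model, which are then averaged by `aggregate` to form the next global model.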
History
Journal: IEEE Transactions on Parallel and Distributed Systems
Volume: 32
Issue: 7
Pagination: 1526 - 1538
Publisher: IEEE
Location: Piscataway, N.J.
Publisher DOI:
ISSN: 1045-9219
eISSN: 2161-9883
Language: eng
Publication classification: C1 Refereed article in a scholarly journal
Keywords: Distributed learning; federated learning; parallel optimization; data parallelism; information theory; mutual information; communication bottleneck; data heterogeneity; Science & Technology; Technology; Computer Science, Theory & Methods; Engineering, Electrical & Electronic; Computer Science; Engineering; Data models; Training; Computational modeling; Servers; Mathematical model; Convergence; Analytical models; FEATURE-SELECTION