
Blockchain-enabled asynchronous federated learning in edge computing

Journal contribution, posted on 2024-06-19, 03:13, authored by Y Liu, Y Qu, C Xu, Z Hao, B Gu
The rapid proliferation of edge computing devices has brought explosive growth of data, which in turn drives the development of machine learning (ML) technology. However, privacy issues arising during data collection for ML tasks raise extensive concerns. To address this, synchronous federated learning (FL) was proposed, which enables central servers and end devices to maintain the same ML model by exchanging only model parameters. However, the diversity of computing power and data sizes leads to significant differences in local training time, which makes synchronous FL inefficient. Moreover, the centralized processing of FL is vulnerable to single-point failure and poisoning attacks. Motivated by this, we propose an innovative method, federated learning with asynchronous convergence (FedAC), which incorporates a staleness coefficient and uses a blockchain network instead of a classic central server to aggregate the global model. This avoids real-world issues such as interruption caused by abnormal local device training failures and dedicated attacks. We implement the proposed method on a real-world dataset, MNIST, and compare it with baseline models, achieving accuracy rates of 98.96% and 95.84% in the horizontal and vertical FL modes, respectively. Extensive evaluation results show that FedAC outperforms most existing models.
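The staleness-weighted asynchronous aggregation mentioned in the abstract can be sketched as below. The polynomial discount and the default values `alpha=0.6`, `a=0.5` are illustrative assumptions (a common choice in asynchronous-FL work), not necessarily FedAC's exact coefficient; the blockchain layer is omitted here.

```python
def staleness_weight(t, tau, alpha=0.6, a=0.5):
    # Polynomial staleness discount: the staler a local update
    # (larger t - tau), the smaller its mixing weight.
    # alpha and a are hypothetical defaults, not from the paper.
    return alpha * (t - tau + 1) ** (-a)

def async_aggregate(global_w, local_w, t, tau):
    # Mix a late-arriving local model into the global model without
    # waiting for other devices (asynchronous convergence).
    # t: current global round; tau: round the local update started from.
    w = staleness_weight(t, tau)
    return [(1.0 - w) * g + w * l for g, l in zip(global_w, local_w)]
```

For example, a fresh update (`t == tau`) is mixed in with weight `alpha`, while an update three rounds stale is discounted to `0.6 * 4**-0.5 = 0.3`, so slow devices still contribute but cannot drag the global model backwards.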

History

Journal: Sensors
Volume: 21
Issue: 10
Article number: 3335
Pagination: 1 - 16
Location: Switzerland
Open access: Yes
ISSN: 1424-8220
eISSN: 1424-8220
Language: English
Publication classification: C1 Refereed article in a scholarly journal
Publisher: MDPI