Privacy-preserving and Communication-efficient Federated Learning Framework

Research output: Contribution to journal › Conference article › peer-review

Abstract

We propose a privacy-preserving and communication-efficient federated learning (FL) algorithm based on ADMM. To safeguard privacy, the proposed algorithm (i) initializes and updates dual variables locally, (ii) transmits only a combined representation of primal and dual variables, preventing model inversion at the parameter server (PS) since the dual variables are not known to the PS, and (iii) integrates differential privacy (DP) and secure aggregation by leveraging random dual variables as perturbation noise that is canceled out after the aggregation step at the PS. This ensures DP for each worker's model while allowing the PS to recover the quantized global model without accessing individual updates. The proposed algorithm achieves privacy with no performance loss or additional overhead, inheriting the benefits of both DP and secure aggregation. For communication efficiency, it employs stochastic quantization while ensuring that the quantization error vanishes as iterations progress. This results in a significant reduction in communication costs while maintaining the same performance as the quantization-free algorithm. Numerical experiments on convex linear regression validate its advantages over standard ADMM and quantized ADMM.
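The core privacy mechanism described above, random perturbations that look like noise to the server but cancel exactly after aggregation, can be illustrated with a generic secure-aggregation-style sketch. This is a minimal illustration only, not the paper's ADMM dual-variable construction; the function names, the Gaussian masks, and the example update values are all assumptions.

```python
import random

def make_zero_sum_masks(n, scale=1.0, rng=None):
    """Generate n random masks that sum to zero. Each mask alone looks
    like noise (hiding the individual worker's update from the server),
    but the masks cancel when the server sums all uploads, so the
    aggregate is recovered exactly."""
    rng = rng or random.Random()
    masks = [rng.gauss(0.0, scale) for _ in range(n - 1)]
    masks.append(-sum(masks))  # last mask forces the sum to zero
    return masks

# Each worker perturbs its local update before uploading it.
updates = [0.5, -1.2, 2.0]
masks = make_zero_sum_masks(len(updates), rng=random.Random(42))
uploads = [u + m for u, m in zip(updates, masks)]

# The server sees only masked uploads, yet the aggregate is exact.
print(abs(sum(uploads) - sum(updates)) < 1e-9)
```

In the paper's algorithm this role is played by the locally held random dual variables, so no extra masking protocol or noise budget is needed, which is why the authors can claim privacy without performance loss.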

Original language: English
Pages (from-to): 1140-1147
Number of pages: 8
Journal: Procedia Computer Science
Volume: 257
DOIs
State: Published - 2025
Event: 16th International Conference on Ambient Systems, Networks and Technologies, ANT 2025 / 8th International Conference on Emerging Data and Industry 4.0, EDI40 2025 - Patras, Greece
Duration: 22 Apr 2025 - 24 Apr 2025

Bibliographical note

Publisher Copyright:
© 2025 Elsevier B.V. All rights reserved.

Keywords

  • ADMM
  • Federated learning
  • Privacy
  • Stochastic quantization

ASJC Scopus subject areas

  • General Computer Science
