Communication Efficient Decentralized Learning over Bipartite Graphs

  • Chaouki Ben Issaid*
  • Anis Elgabli
  • Jihong Park
  • Mehdi Bennis
  • Merouane Debbah

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

13 Scopus citations

Abstract

In this paper, we propose a communication-efficient decentralized machine learning framework that solves a consensus optimization problem defined over a network of interconnected workers. The proposed algorithm, Censored and Quantized Generalized GADMM (CQ-GGADMM), leverages the worker grouping and decentralized learning ideas of Group Alternating Direction Method of Multipliers (GADMM) and pushes the frontier in communication efficiency by extending its applicability to generalized network topologies, while incorporating link censoring for negligible updates after quantization. We theoretically prove that CQ-GGADMM achieves a linear convergence rate when the local objective functions are strongly convex, under some mild assumptions. Numerical simulations corroborate that CQ-GGADMM exhibits higher communication efficiency, in terms of the number of communication rounds and transmit energy consumption, without compromising accuracy or convergence speed, compared to the censored decentralized ADMM and the worker grouping method of GADMM.
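The censoring-after-quantization idea described in the abstract can be sketched as follows: a worker quantizes the change in its local model and transmits it only when the quantized change is non-negligible. This is a minimal illustrative sketch, not the paper's exact algorithm; the quantizer, the censoring threshold rule, and the function names (`quantize`, `censored_update`) are all assumptions made for illustration.

```python
import numpy as np

def quantize(delta, num_bits=4):
    """Uniformly quantize an update vector to 2**num_bits - 1 levels per
    coordinate. Hypothetical quantizer; the paper's scheme may differ."""
    levels = 2 ** num_bits - 1
    scale = np.max(np.abs(delta))
    if scale == 0:
        return delta
    # Snap each entry to the nearest of `levels` uniform levels in [-scale, scale].
    q = np.round((delta + scale) / (2 * scale) * levels)
    return q / levels * (2 * scale) - scale

def censored_update(prev_sent, current, threshold=1e-3, num_bits=4):
    """Quantize the model change since the last transmission and send it
    only if it exceeds the censoring threshold.

    Returns (payload, new_reference); payload is None when the link is
    censored, i.e. the worker stays silent this round."""
    q_delta = quantize(current - prev_sent, num_bits)
    if np.linalg.norm(q_delta) <= threshold:
        return None, prev_sent            # censor: skip this round
    return q_delta, prev_sent + q_delta   # transmit quantized difference
```

A large model change is transmitted (in quantized form), while a tiny change is censored and costs no communication, which is the source of the savings in communication rounds and transmit energy that the abstract reports.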

Original language: English
Pages (from-to): 4150-4167
Number of pages: 18
Journal: IEEE Transactions on Wireless Communications
Volume: 21
Issue number: 6
DOIs
State: Published - 1 Jun 2022
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2022 Institute of Electrical and Electronics Engineers Inc. All rights reserved.

ASJC Scopus subject areas

  • Computer Science Applications
  • Electrical and Electronic Engineering
  • Applied Mathematics
