Q-GADMM: Quantized Group ADMM for Communication Efficient Decentralized Machine Learning

Anis Elgabli*, Jihong Park, Amrit Singh Bedi, Chaouki Ben Issaid, Mehdi Bennis, Vaneet Aggarwal

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

44 Scopus citations

Abstract

In this article, we propose a communication-efficient decentralized machine learning (ML) algorithm, coined quantized group ADMM (Q-GADMM). To reduce the number of communication links, every worker in Q-GADMM communicates only with two neighbors, while updating its model via the group alternating direction method of multipliers (GADMM). Moreover, each worker transmits only the quantized difference between its current model and its previously quantized model, thereby decreasing the communication payload size. However, due to the lack of a centralized entity in decentralized ML, the spatially sparse communication topology and the payload compression may incur error propagation, hindering the convergence of model training. To overcome this, we develop a novel stochastic quantization method that adaptively adjusts model quantization levels and their probabilities, and we prove the convergence of Q-GADMM for convex objective functions. Furthermore, to demonstrate the feasibility of Q-GADMM for non-convex and stochastic problems, we propose quantized stochastic GADMM (Q-SGADMM), which incorporates deep neural network architectures and stochastic sampling. Simulation results corroborate that Q-GADMM significantly outperforms GADMM in communication efficiency while achieving the same accuracy and convergence speed for a linear regression task. Similarly, for an image classification task using a deep neural network (DNN), Q-SGADMM incurs a significantly lower total communication cost with identical accuracy and convergence speed compared to its counterpart without quantization, i.e., stochastic GADMM (SGADMM).
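The quantized-difference idea described in the abstract can be illustrated with a short sketch. The Python snippet below is an illustrative assumption, not the authors' exact Q-GADMM quantizer: the function name stochastic_quantize_diff, the per-round choice of quantization range, and all parameters are hypothetical. It shows an unbiased stochastic quantizer applied to the difference between a worker's current model and its previously quantized model, which is the quantity each worker transmits.

import numpy as np

def stochastic_quantize_diff(theta, theta_hat_prev, bits=2, rng=None):
    # Hypothetical sketch: quantize the difference between the current model
    # `theta` and the previously quantized model `theta_hat_prev` onto
    # 2**bits uniformly spaced levels, rounding each coordinate up or down
    # at random so that the quantizer is unbiased in expectation.
    rng = np.random.default_rng() if rng is None else rng
    diff = theta - theta_hat_prev
    radius = np.max(np.abs(diff)) + 1e-12        # quantization range for this round (assumed choice)
    step = 2.0 * radius / (2 ** bits - 1)        # spacing between adjacent quantization levels
    scaled = (diff + radius) / step              # map each coordinate onto the level index axis
    lower = np.floor(scaled)                     # nearest level below
    prob_up = scaled - lower                     # probability of rounding up keeps E[q] = diff
    levels = lower + (rng.random(diff.shape) < prob_up)
    quantized_diff = levels * step - radius      # decode back to the model-difference domain
    theta_hat_new = theta_hat_prev + quantized_diff
    # the worker would transmit (radius, integer level indices); neighbors
    # holding theta_hat_prev can reconstruct theta_hat_new locally
    return theta_hat_new, levels.astype(np.int64), radius

For example, with bits=2 each coordinate of the difference maps to one of four levels, so a worker sends only a scalar range plus two bits per coordinate instead of a full-precision vector, and the quantizer remains unbiased because each coordinate rounds up with probability equal to its fractional distance to the upper level.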

Original language: English
Article number: 9205203
Pages (from-to): 164-181
Number of pages: 18
Journal: IEEE Transactions on Communications
Volume: 69
Issue number: 1
DOIs
State: Published - Jan 2021
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 1972-2012 IEEE.

Keywords

  • ADMM
  • Communication-efficient decentralized machine learning
  • GADMM
  • Stochastic quantization

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
