Local Stochastic ADMM for Communication-Efficient Distributed Learning

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

In this paper, we propose a communication-efficient alternating direction method of multipliers (ADMM)-based algorithm for solving a distributed learning problem in the stochastic non-convex setting. Instead of finding an exact or high-accuracy approximate solution to the local subproblem at each worker, as existing ADMM-based works do, our approach runs only a few stochastic gradient descent (SGD) steps. By doing so, the proposed framework strikes a good balance between computation and communication costs. Extensive simulation results show that our algorithm significantly outperforms existing stochastic ADMM methods in terms of communication efficiency, notably in the presence of non-independent and identically distributed (non-IID) data.
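The general idea described in the abstract can be sketched as follows. This is a minimal illustrative implementation of consensus ADMM in which each worker's x-update is replaced by a few mini-batch SGD steps on its augmented-Lagrangian subproblem, rather than an exact solve. It is not the paper's algorithm: the convex least-squares losses, the shifted-feature non-IID split, and all hyperparameters (`rho`, `lr`, `steps`, round count) are assumptions chosen for a self-contained demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic non-IID linear-regression data split across workers (illustrative assumption:
# each worker's features have a different mean, mimicking a non-IID split).
NUM_WORKERS, DIM, N_PER = 4, 5, 200
w_true = rng.normal(size=DIM)
data = []
for i in range(NUM_WORKERS):
    A = rng.normal(loc=i, scale=1.0, size=(N_PER, DIM))  # shifted features -> non-IID
    b = A @ w_true + 0.1 * rng.normal(size=N_PER)
    data.append((A, b))

def local_sgd_update(x, y, z, A, b, rho, lr=0.01, steps=5, batch=32):
    """Approximately minimize f_i(x) + y^T (x - z) + (rho/2)||x - z||^2
    with a few SGD steps instead of an exact/high-accuracy solve."""
    n = A.shape[0]
    for _ in range(steps):
        idx = rng.choice(n, size=batch, replace=False)
        grad_f = A[idx].T @ (A[idx] @ x - b[idx]) / batch  # stochastic gradient of local loss
        grad = grad_f + y + rho * (x - z)                  # gradient of the augmented Lagrangian
        x = x - lr * grad
    return x

rho = 1.0
z = np.zeros(DIM)                                  # global consensus variable
xs = [np.zeros(DIM) for _ in range(NUM_WORKERS)]   # local primal variables
ys = [np.zeros(DIM) for _ in range(NUM_WORKERS)]   # dual variables

for rnd in range(200):                             # one communication round per outer iteration
    # Worker side: a few local SGD steps (the communication-saving approximation).
    xs = [local_sgd_update(xs[i], ys[i], z, *data[i], rho) for i in range(NUM_WORKERS)]
    # Server side: standard consensus-ADMM z- and dual updates.
    z = np.mean([xs[i] + ys[i] / rho for i in range(NUM_WORKERS)], axis=0)
    ys = [ys[i] + rho * (xs[i] - z) for i in range(NUM_WORKERS)]

print("error vs. ground truth:", np.linalg.norm(z - w_true))
```

Communication happens only once per outer round (workers send `x_i`, the server broadcasts `z`), so the `steps` parameter trades extra local computation for fewer rounds, which is the balance the abstract refers to.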

Original language: English
Title of host publication: 2022 IEEE Wireless Communications and Networking Conference, WCNC 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1880-1885
Number of pages: 6
ISBN (Electronic): 9781665442664
DOIs
State: Published - 2022
Externally published: Yes

Publication series

Name: IEEE Wireless Communications and Networking Conference, WCNC
Volume: 2022-April
ISSN (Print): 1525-3511

Bibliographical note

Publisher Copyright:
© 2022 IEEE.

Keywords

  • Communication-efficiency
  • alternating direction method of multipliers (ADMM)
  • stochastic non-convex distributed optimization

ASJC Scopus subject areas

  • General Engineering

