Abstract
In this paper, we propose a communication-efficient alternating direction method of multipliers (ADMM)-based algorithm for solving a distributed learning problem in the stochastic non-convex setting. Instead of finding an exact or approximate solution to the local problem at each worker, as existing ADMM-based works propose, our approach runs only a few stochastic gradient descent (SGD) steps. By doing so, the proposed framework strikes a good balance between computation and communication costs. Extensive simulation results show that our algorithm significantly outperforms existing stochastic ADMM methods in terms of communication efficiency, notably in the presence of non-independent and identically distributed (non-IID) data.
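The idea described in the abstract, replacing the exact local ADMM subproblem solve with a few SGD steps, can be illustrated with a minimal consensus-ADMM sketch. This is an assumption-laden illustration, not the paper's actual algorithm: the quadratic local losses, the parameter names (`rho`, `lr`, `local_steps`), and the update order are all illustrative choices.

```python
import numpy as np

# Hedged sketch: consensus ADMM where each worker approximately solves its
# augmented-Lagrangian subproblem with a few SGD steps (the abstract's idea),
# rather than an exact solve. Local losses f_i(x) = 0.5 * ||A_i x - b_i||^2
# are a stand-in for the paper's (possibly non-convex) objectives.

rng = np.random.default_rng(0)
n_workers, dim, n_samples, batch = 4, 5, 20, 5
A = [rng.standard_normal((n_samples, dim)) for _ in range(n_workers)]
b = [rng.standard_normal(n_samples) for _ in range(n_workers)]

rho, lr, local_steps, rounds = 1.0, 0.01, 5, 200   # illustrative values
x = [np.zeros(dim) for _ in range(n_workers)]       # local primal variables
y = [np.zeros(dim) for _ in range(n_workers)]       # dual variables
z = np.zeros(dim)                                   # global consensus variable

for _ in range(rounds):
    for i in range(n_workers):
        for _ in range(local_steps):
            # Minibatch stochastic gradient of f_i, rescaled to full-batch
            # magnitude, plus the gradient of the ADMM penalty terms.
            idx = rng.choice(n_samples, size=batch, replace=False)
            g = A[i][idx].T @ (A[i][idx] @ x[i] - b[i][idx]) * (n_samples / batch)
            g += y[i] + rho * (x[i] - z)
            x[i] -= lr * g
    # One communication round: server averages, workers update duals.
    z = np.mean([x[i] + y[i] / rho for i in range(n_workers)], axis=0)
    for i in range(n_workers):
        y[i] += rho * (x[i] - z)

consensus_gap = max(np.linalg.norm(x[i] - z) for i in range(n_workers))
```

Because each worker performs only `local_steps` cheap SGD updates between communication rounds, the per-round computation is bounded, which is the computation/communication trade-off the abstract refers to.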
| Original language | English |
|---|---|
| Title of host publication | 2022 IEEE Wireless Communications and Networking Conference, WCNC 2022 |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 1880-1885 |
| Number of pages | 6 |
| ISBN (Electronic) | 9781665442664 |
| DOIs | |
| State | Published - 2022 |
| Externally published | Yes |
Publication series
| Name | IEEE Wireless Communications and Networking Conference, WCNC |
|---|---|
| Volume | 2022-April |
| ISSN (Print) | 1525-3511 |
Bibliographical note
Publisher Copyright: © 2022 IEEE.
Keywords
- Communication-efficiency
- alternating direction method of multipliers (ADMM)
- stochastic non-convex distributed optimization
ASJC Scopus subject areas
- General Engineering