Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content here is summarized using Google Gemini and operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

FedComLoc: Communication-Efficient Distributed Training of Sparse and Quantized Models

Created by
  • Haebom

Author

Kai Yi, Georg Meinhardt, Laurent Condat, Peter Richtárik

Outline

This paper proposes FedComLoc, a novel algorithm that addresses the high communication cost of Federated Learning (FL). Building on the strengths of the Scaffnew algorithm, FedComLoc further improves communication efficiency by incorporating practical compression techniques such as TopK sparsification and quantization. Experimental results demonstrate that the method significantly reduces communication overhead in heterogeneous environments.
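
To make the compression step concrete, below is a minimal NumPy sketch of the two compressor families the summary mentions, TopK sparsification and uniform quantization, applied to a client's local model update. The function names (top_k_compress, quantize) and the 8-bit uniform scheme are illustrative assumptions for demonstration, not the authors' implementation.

```python
# Minimal sketch, assuming TopK sparsification and uniform quantization of
# a client's model update before communication. Illustrative only; not the
# FedComLoc reference code.
import numpy as np

def top_k_compress(update: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k entries with largest magnitude; zero out the rest."""
    flat = update.ravel()
    if k >= flat.size:
        return update.copy()
    # Indices of the k largest |values|; everything else is dropped.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    out = np.zeros_like(flat)
    out[idx] = flat[idx]
    return out.reshape(update.shape)

def quantize(update: np.ndarray, bits: int = 8) -> np.ndarray:
    """Uniformly quantize values to 2**bits levels, then dequantize."""
    lo, hi = update.min(), update.max()
    if hi == lo:  # constant tensor: nothing to quantize
        return update.copy()
    levels = 2 ** bits - 1
    scale = (hi - lo) / levels
    q = np.round((update - lo) / scale)  # integer codes a client would send
    return q * scale + lo                # dequantized reconstruction

# Example: a client compresses its local update before sending it upstream.
rng = np.random.default_rng(0)
delta = rng.normal(size=(4, 5))
sparse_delta = top_k_compress(delta, k=5)        # transmit 5 of 20 entries
quantized_delta = quantize(sparse_delta, bits=8)  # at 8 bits per entry
print(f"nonzeros sent: {np.count_nonzero(sparse_delta)} / {delta.size}")
```

In practice only the surviving indices, their quantized codes, and the (lo, scale) pair would be transmitted, which is where the communication savings come from.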

Takeaways, Limitations

Takeaways:
  • Presents a method that further improves the communication efficiency of the Scaffnew algorithm.
  • Demonstrates a practical way to reduce communication cost using TopK compression and quantization.
  • Contributes to improving federated learning performance in heterogeneous environments.
Limitations:
  • The proposed algorithm depends on a specific compression technique (TopK).
  • Further experiments with various compression techniques and data distributions are needed.
  • Additional validation is needed for real-world application.