Daily Arxiv

This page collects papers on artificial intelligence published around the world.
Summaries are generated with Google Gemini, and the page is operated on a non-profit basis.
Copyright of each paper belongs to its authors and their institutions; when sharing, please cite the source.

Communication-Efficient and Accurate Approach for Aggregation in Federated Low-Rank Adaptation

Created by
  • Haebom

Authors

Le-Tuan Nguyen, Minh-Duong Nguyen, Seon-Geun Jeong, Dung D. Le, Quoc-Viet Pham

FLoRA-NA: Federated Low-Rank Aggregation with Nearly Accurate Estimation

Outline

This paper proposes Federated LoRA with Nearly Accurate Estimation (FLoRA-NA) to address the limitations of Federated Low-Rank Adaptation (FedLoRA) for fine-tuning foundation models in distributed environments. On the server, FLoRA-NA uses the clients' local LoRA matrices to estimate the aggregated matrices, which are then distributed back to the clients for subsequent local updates. This targets the two main issues of existing FedLoRA methods, the local-global generalization gap and high communication costs, improving communication efficiency while bridging local personalization and global generalization. Experiments across various tasks and foundation models show that FLoRA-NA achieves state-of-the-art performance while keeping communication costs low.
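To make the aggregation issue concrete, the sketch below contrasts naive per-factor averaging of LoRA matrices with a server-side low-rank estimate of the ideal aggregated update. This is a minimal illustration, not the paper's method: the truncated-SVD estimation step, the function names, and the toy dimensions are assumptions chosen to show why averaging A and B separately misses the true aggregate and why a "nearly accurate" server-side estimate helps.

```python
import numpy as np

def naive_fedlora_aggregate(As, Bs):
    """Naive FedLoRA baseline: average the A and B factors separately.
    Note that mean(B) @ mean(A) generally differs from mean(B @ A)."""
    A_bar = np.mean(As, axis=0)
    B_bar = np.mean(Bs, axis=0)
    return B_bar, A_bar

def nearly_accurate_aggregate(As, Bs, rank):
    """Hypothetical server-side estimate (assumption, not FLoRA-NA's exact
    procedure): factor the ideal aggregated update mean_k(B_k @ A_k) back
    into rank-r matrices via truncated SVD, so only low-rank factors are
    sent back to clients instead of the full d_out x d_in update."""
    ideal = np.mean([B @ A for A, B in zip(As, Bs)], axis=0)
    U, S, Vt = np.linalg.svd(ideal, full_matrices=False)
    B_hat = U[:, :rank] * S[:rank]   # (d_out, r)
    A_hat = Vt[:rank, :]             # (r, d_in)
    return B_hat, A_hat

# Toy comparison of the two aggregation rules with K simulated clients.
rng = np.random.default_rng(0)
d_out, d_in, r, K = 32, 16, 4, 5
As = [rng.normal(size=(r, d_in)) for _ in range(K)]
Bs = [rng.normal(size=(d_out, r)) for _ in range(K)]

ideal = np.mean([B @ A for A, B in zip(As, Bs)], axis=0)
B_n, A_n = naive_fedlora_aggregate(As, Bs)
B_e, A_e = nearly_accurate_aggregate(As, Bs, r)

print("naive aggregation error :", np.linalg.norm(B_n @ A_n - ideal))
print("SVD-based estimate error:", np.linalg.norm(B_e @ A_e - ideal))
```

On this toy example the SVD-based factors match the ideal aggregate far more closely than separately averaged factors, which is the kind of discrepancy FLoRA-NA's nearly accurate estimation is designed to reduce while keeping only low-rank factors in the communication round.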

Takeaways, Limitations

Takeaways:
  • Improves FedLoRA's efficiency by narrowing the local-global generalization gap and reducing communication costs.
  • Achieves state-of-the-art results across diverse tasks: natural language understanding, mathematical reasoning, and code solving.
  • Balances local personalization with global generalization.
Limitations:
  • No specific limitations are discussed in the paper.