Daily Arxiv

This page curates AI-related papers published worldwide.
Summaries are generated with Google Gemini, and the page is operated on a non-profit basis.
Copyright for each paper belongs to its authors and their institutions; please credit the source when sharing.

FedSA-GCL: A Semi-Asynchronous Federated Graph Learning Framework with Personalized Aggregation and Cluster-Aware Broadcasting

Created by
  • Haebom

Author

Zhongzheng Yuan, Lianshuai Guo, Xunkai Li, Yinlin Zhu, Wenyu Wang, Meixia Qu

Outline

This paper proposes FedSA-GCL, a Federated Graph Learning (FGL) framework for learning over large-scale subgraphs in distributed environments. To address the inefficiency of the synchronous communication used by existing FGL methods, it adopts a semi-asynchronous scheme and introduces the ClusterCast mechanism, which exploits differences in label distributions across clients together with graph topological features. Partitioning real-world graph data with the Louvain and Metis algorithms, the authors compare the framework against nine baseline models, reporting average performance improvements of 2.92% (Louvain) and 3.4% (Metis) along with strong robustness.
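To make the semi-asynchronous idea concrete, here is a minimal sketch of such an aggregation loop. This is an illustration of the general technique, not the paper's exact algorithm: the server waits only for the `k` fastest clients each round, weights their updates by a hypothetical staleness-decay factor, and lets slower clients fall behind. All names (`local_update`, `semi_async_round`, the client dict fields) are invented for this example.

```python
def local_update(weights, client_data, lr=0.1):
    # Hypothetical local step: nudge each weight toward the client's data value.
    return [w - lr * (w - client_data) for w in weights]

def semi_async_round(global_w, client_states, k=2, staleness_decay=0.5):
    """One semi-asynchronous round (sketch): aggregate the k fastest clients,
    down-weighting updates that were computed on stale global models."""
    # The k clients with the lowest simulated latency participate this round.
    arrived = sorted(client_states, key=lambda c: c["speed"])[:k]
    total, agg = 0.0, [0.0] * len(global_w)
    for c in arrived:
        new_w = local_update(c["model"], c["data"])
        # Staleness weight: updates based on older global models count less.
        weight = staleness_decay ** c["staleness"]
        agg = [a + weight * w for a, w in zip(agg, new_w)]
        total += weight
    new_global = [a / total for a in agg]
    for c in client_states:
        if c in arrived:
            c["model"], c["staleness"] = list(new_global), 0
        else:
            c["staleness"] += 1  # non-participants fall one round behind
    return new_global
```

Compared with a fully synchronous round, the server never blocks on the slowest client; the staleness weighting is one common way to keep delayed updates from destabilizing the global model.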

Takeaways, Limitations

Takeaways:
Presents a novel semi-asynchronous framework that improves the efficiency of federated graph learning.
Proposes the ClusterCast mechanism, which effectively exploits inter-client label-distribution differences and graph topological features.
Demonstrates strong performance and robustness in experiments on real-world datasets.
Overcomes limitations of existing synchronous FGL methods.
Limitations:
Further research is needed on the scalability of the proposed method.
Generalization performance needs to be evaluated across diverse graph structures and data distributions.
Ways to mitigate dependence on specific graph partitioning algorithms should be explored.