
Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content here is summarized using Google Gemini and operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

Scaling Decentralized Learning with FLock

Created by
  • Haebom

Author

Zehua Cheng, Rui Sun, Jiahao Sun, Yike Guo

Outline

In this paper, we propose FLock, a decentralized framework for secure and efficient collaborative fine-tuning of large language models (LLMs) in distributed environments. Whereas traditional federated learning (FL) suffers from a single point of failure and attack at its central server, FLock provides a secure and auditable collaboration protocol among untrusted parties by incorporating a blockchain-based trust layer and economic incentives. We present the first experimental validation of fine-tuning a 70B-parameter LLM in a secure, multi-domain decentralized setting, and show that the FLock framework defends against backdoor attacks that corrupt standard FL optimizers while promoting synergistic knowledge transfer. The resulting model reduces the adversarial attack success rate by more than 68% and exhibits better cross-domain generalization than independently trained models.
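The summary does not specify FLock's exact aggregation rule, so the following is a hypothetical sketch of the general idea behind backdoor-robust aggregation: compared with plain federated averaging, a Byzantine-robust aggregator (here, the coordinate-wise median, a standard technique not attributed to the paper) keeps a poisoned minority update from steering the global model.

```python
# Hypothetical illustration: FLock's actual defense mechanism is not described
# in this summary. This contrasts plain federated averaging with a generic
# Byzantine-robust aggregator (coordinate-wise median).
import numpy as np

def fedavg(updates):
    """Plain federated averaging: a single poisoned update shifts the result."""
    return np.mean(np.stack(updates), axis=0)

def robust_aggregate(updates):
    """Coordinate-wise median: a poisoned minority has bounded influence."""
    return np.median(np.stack(updates), axis=0)

# Three honest clients send small gradient updates; one attacker sends a
# large backdoor-style update.
honest = [np.array([0.1, -0.2, 0.05]) for _ in range(3)]
attacker = [np.array([10.0, 10.0, 10.0])]
updates = honest + attacker

avg = fedavg(updates)            # pulled strongly toward the attacker
med = robust_aggregate(updates)  # stays at the honest clients' update
```

Here the averaged result is dominated by the attacker's large values, while the median matches the honest updates exactly; robust aggregation of this kind is one common building block in defenses against the backdoor attacks the paper evaluates.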

Takeaways, Limitations

Takeaways:
Presenting a feasible framework for secure and efficient decentralized fine-tuning of a 70B-parameter LLM
Implementing a secure and auditable collaboration protocol through a blockchain-based trust layer and economic incentives
Defending against backdoor attacks while promoting synergistic knowledge transfer
Achieving strong cross-domain generalization performance
Limitations:
Experimental validation is limited to the 70B-parameter model; additional experiments on models of various scales are needed.
Blockchain-based systems raise scalability and transaction-cost concerns.
Further research is needed on applicability and stability in various real-world environments.