This paper proposes Federated LoRA with Nearly Accurate Estimation (FLoRA-NA) to address the limitations of Federated Low-Rank Adaptation (FedLoRA) for fine-tuning foundation models in distributed environments. At the server, FLoRA-NA uses the clients' local LoRA matrices to estimate aggregated global matrices, which it then distributes to the clients for subsequent local updates. This design targets the two main weaknesses of existing FedLoRA methods: the gap between local personalization and global generalization, and high communication costs. Experiments across diverse tasks and foundation models demonstrate that FLoRA-NA achieves state-of-the-art performance while keeping communication overhead low.
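To make the server-side aggregation concrete, below is a minimal sketch assuming clients upload their LoRA factors (A_k, B_k) and the server seeks a single pair (Â, B̂) whose product approximates the average of the local low-rank updates, i.e. B̂Â ≈ (1/K) Σ_k B_k A_k. The truncated-SVD estimator and the helper name `server_estimate` are illustrative assumptions, not the paper's exact procedure; FLoRA-NA's actual estimation rule is defined in the paper.

```python
import numpy as np

def server_estimate(As, Bs, rank):
    """Illustrative server step: given local LoRA factors {A_k, B_k}
    from K clients, estimate a global pair (A_hat, B_hat) such that
    B_hat @ A_hat approximates the average of the local updates.

    The SVD-based factorization here is an assumption for
    illustration only.
    """
    # Ideal aggregated update: the average of per-client products.
    # Averaging A and B separately, as naive FedLoRA aggregation
    # does, generally does NOT equal this quantity.
    delta_w = np.mean([B @ A for A, B in zip(As, Bs)], axis=0)

    # Re-factor the aggregated update into rank-r matrices so that
    # clients receive the same lightweight LoRA format they sent.
    U, S, Vt = np.linalg.svd(delta_w, full_matrices=False)
    B_hat = U[:, :rank] * S[:rank]   # shape (d_out, r)
    A_hat = Vt[:rank, :]             # shape (r, d_in)
    return A_hat, B_hat

# Toy usage: 3 clients, a weight of shape (8, 16), LoRA rank 2.
rng = np.random.default_rng(0)
As = [rng.normal(size=(2, 16)) for _ in range(3)]
Bs = [rng.normal(size=(8, 2)) for _ in range(3)]
A_hat, B_hat = server_estimate(As, Bs, rank=2)
avg_update = np.mean([B @ A for A, B in zip(As, Bs)], axis=0)
print(np.linalg.norm(B_hat @ A_hat - avg_update))  # approximation error
```

Note that the average of K rank-r updates can have rank up to Kr, so any rank-r pair is only a near approximation, which is consistent with the "nearly accurate estimation" framing.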