To address the heterogeneous-domain problem in Federated Learning (FL), where each client's features follow a different distribution, this paper proposes I$^2$PFL, a novel Federated Prototype Learning method that exploits both intra-domain and inter-domain prototypes. I$^2$PFL captures diversity within each local domain and improves generalization by aligning features against MixUp-based augmented prototypes, and it introduces a reweighting mechanism for inter-domain prototypes that supplies cross-domain knowledge and reduces domain bias. Experiments on the Digits, Office-10, and PACS datasets demonstrate superior performance over existing methods.
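To make the two ingredients concrete, the sketch below illustrates (a) MixUp-style augmentation of local class prototypes and (b) a similarity-reweighted aggregation of inter-domain prototypes. This is a generic illustration of the ideas named in the abstract, not the paper's exact formulation; the function names, the Beta-sampled mixing coefficients, and the softmax-over-cosine-similarity weighting are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mixup_prototypes(protos, alpha=0.5, rng=rng):
    """MixUp-style augmentation: convex combinations of randomly paired
    class prototypes within one client (illustrative, not the paper's
    exact scheme)."""
    n = len(protos)
    lam = rng.beta(alpha, alpha, size=n)        # mixing coefficients
    perm = rng.permutation(n)                   # random pairing of prototypes
    return lam[:, None] * protos + (1 - lam)[:, None] * protos[perm]

def reweighted_inter_domain(client_protos, local_proto):
    """Aggregate other clients' prototypes for one class, weighting each
    by cosine similarity to the local prototype (a hypothetical
    reweighting rule standing in for the paper's mechanism)."""
    sims = np.array([
        p @ local_proto / (np.linalg.norm(p) * np.linalg.norm(local_proto))
        for p in client_protos
    ])
    weights = np.exp(sims) / np.exp(sims).sum()  # softmax over similarities
    return (weights[:, None] * client_protos).sum(axis=0)

# Toy usage: 3 class prototypes of dimension 4 on one client.
local = rng.normal(size=(3, 4))
augmented = mixup_prototypes(local)
# Inter-domain prototype for one class, aggregated from 5 other clients.
others = rng.normal(size=(5, 4))
global_proto = reweighted_inter_domain(others, local[0])
print(augmented.shape, global_proto.shape)
```

The similarity-based weights down-weight prototypes from clients whose feature distribution differs sharply from the local one, which is one plausible way to reduce the domain bias the abstract refers to.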