This paper proposes MMiC, a novel framework that addresses the problem of missing modalities in multimodal federated learning (MFL), a distributed learning paradigm that trains on data spanning diverse modalities. MMiC mitigates the impact of missing modalities by replacing partial parameters of client models within clusters, optimizes client selection using the Banzhaf Power Index, and dynamically controls global aggregation via Markowitz Portfolio Optimization. Experimental results demonstrate that MMiC outperforms existing federated learning architectures in both global and personalized performance on multimodal datasets with missing modalities.
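To give a concrete sense of the client-selection ingredient, the sketch below computes normalized Banzhaf power indices for a simple weighted voting game. The valuation function MMiC actually uses for clients is specific to the paper; the weights and quota here are purely illustrative assumptions, not the paper's setup.

```python
from itertools import combinations

def banzhaf_indices(weights, quota):
    """Normalized Banzhaf power index for a weighted voting game.

    weights[i] is player i's voting weight; a coalition "wins" when its
    total weight reaches the quota. A player's raw Banzhaf score counts
    the coalitions in which that player is pivotal, i.e. joining turns
    a losing coalition into a winning one.
    """
    n = len(weights)
    raw = [0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for r in range(n):
            for coalition in combinations(others, r):
                s = sum(weights[j] for j in coalition)
                # i is pivotal: the coalition loses without i, wins with i
                if s < quota and s + weights[i] >= quota:
                    raw[i] += 1
    total = sum(raw)
    return [x / total for x in raw] if total else raw

# Illustrative only: score four hypothetical clients and rank them,
# e.g. to pick the most "pivotal" ones as selection candidates.
scores = banzhaf_indices([4, 3, 2, 1], quota=6)
# → [5/12, 3/12, 3/12, 1/12]: client 0 is pivotal most often
```

In a federated setting, such indices could serve as a principled ranking of how much each client's participation changes the coalition outcome, which is the intuition behind using the Banzhaf Power Index for client selection.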