In this paper, we propose MMiC, a novel framework that addresses the missing-modality problem in multimodal federated learning (MFL), a distributed learning paradigm in which clients hold data of diverse modalities. MMiC mitigates the impact of missing modalities by replacing partial parameters of client models within a cluster, optimizes client selection using the Banzhaf Power Index, and dynamically controls global aggregation via Markowitz Portfolio Optimization. Experimental results show that MMiC outperforms existing federated learning architectures in both global and personalized performance on multimodal datasets with missing modalities.
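Since the Banzhaf Power Index drives client selection in this framework, a minimal sketch of how the index can be computed for a weighted voting game may help fix intuition. The weights, quota, and function names below are illustrative assumptions, not MMiC's actual implementation:

```python
from itertools import combinations

def banzhaf_indices(weights, quota):
    """Normalized Banzhaf power index for a weighted voting game.

    weights[i] is the voting weight of player i; a coalition wins when
    its total weight reaches the quota. Player i's raw index is the
    number of coalitions for which i is critical (the coalition loses
    without i but wins with i), normalized over all players.
    """
    n = len(weights)
    swings = [0] * n
    for i in range(n):
        others = [p for p in range(n) if p != i]
        for r in range(len(others) + 1):
            for coalition in combinations(others, r):
                total = sum(weights[p] for p in coalition)
                # i is critical: coalition loses without i, wins with i
                if total < quota <= total + weights[i]:
                    swings[i] += 1
    total_swings = sum(swings)
    return [s / total_swings for s in swings]

# Classic textbook example: weights [4, 2, 1], quota 5
print(banzhaf_indices([4, 2, 1], 5))  # → [0.6, 0.2, 0.2]
```

In a federated-learning setting, a weight might be derived from each client's estimated contribution to the global model, so that clients with higher power are prioritized during selection.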