To address the missing-modality problem in Alzheimer's disease (AD) diagnosis, where many patients lack complete imaging data due to cost and clinical constraints, we propose a prototype-guided adaptive distillation (PGAD) framework that directly incorporates incomplete multimodal data into training. PGAD enriches missing-modality representations through prototype matching and balances learning across modality configurations with a dynamic sampling strategy. We validate PGAD on the ADNI dataset under missing rates of 20%, 50%, and 70%, demonstrating significant performance improvements over existing state-of-the-art approaches. Further experiments confirm the effectiveness of prototype matching and adaptive sampling, highlighting the potential of this framework for robust and scalable AD diagnosis in real-world clinical settings.
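To illustrate the prototype-matching idea mentioned above, the sketch below shows one plausible reading: class prototypes are computed as mean embeddings from complete-modality samples, and a sample with a missing modality is enhanced by blending its partial embedding with a similarity-weighted combination of those prototypes. All function names, the temperature `tau`, and the 50/50 blending weight are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def class_prototypes(embeddings, labels, num_classes):
    """Mean embedding per class, computed from complete-modality samples."""
    return np.stack([embeddings[labels == c].mean(axis=0)
                     for c in range(num_classes)])

def prototype_enhance(partial_emb, prototypes, tau=0.1, alpha=0.5):
    """Enhance a missing-modality embedding by soft-matching to prototypes.

    partial_emb : (d,) embedding from the available modalities
    prototypes  : (C, d) class prototypes
    tau         : softmax temperature over cosine similarities (assumed)
    alpha       : blending weight between the original and matched embedding
    """
    # Cosine similarity between the partial embedding and each prototype
    p = partial_emb / (np.linalg.norm(partial_emb) + 1e-8)
    P = prototypes / (np.linalg.norm(prototypes, axis=1, keepdims=True) + 1e-8)
    sims = P @ p                      # (C,)
    # Soft assignment over prototypes
    w = np.exp(sims / tau)
    w /= w.sum()
    # Blend the partial embedding with the prototype-matched representation
    return alpha * partial_emb + (1.0 - alpha) * (w @ prototypes)
```

In this reading, the prototype term supplies class-level structure that the missing modality would otherwise have contributed, while `alpha` preserves the sample-specific signal from the observed modalities.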