This paper proposes Robust Knowledge Distillation-based Semantic Communication (RKD-SC), a novel framework for designing large-scale model (LSM)-based semantic communication (SC) systems that are both efficient and robust to channel noise. RKD-SC combines a Knowledge Distillation-based Lightweight Differentiable Architecture Search (KDL-DARTS) algorithm with a two-stage Robust Knowledge Distillation (RKD) algorithm to reduce model size and enhance robustness to channel noise while preserving the performance of the LSM. Furthermore, it introduces a Channel-Aware Transformer (CAT) block, trained with variable-length outputs under diverse channel conditions, to improve resilience to channel impairments. Extensive simulations on image classification tasks show that the RKD-SC framework substantially reduces the number of model parameters while maintaining high performance and exhibiting superior robustness compared with existing methods.
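To make the core idea concrete, the following is a minimal sketch of knowledge distillation combined with training under simulated channel noise, the two ingredients the abstract names. It is not the paper's KDL-DARTS or two-stage RKD implementation: the `awgn` helper, the toy `teacher`/`student` models, the SNR range, and all hyperparameters are illustrative assumptions in a generic PyTorch setup.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def awgn(x: torch.Tensor, snr_db: float) -> torch.Tensor:
    """Add white Gaussian noise at a given SNR (dB) to emulate a noisy channel."""
    signal_power = x.pow(2).mean()
    noise_power = signal_power / (10 ** (snr_db / 10))
    return x + torch.randn_like(x) * noise_power.sqrt()

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Standard KD objective: KL to the softened teacher plus cross-entropy on labels."""
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

# Hypothetical usage: distill a compact student under randomly drawn SNRs,
# so it learns to match the teacher despite channel impairments.
teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10)).eval()
student = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(8, 3, 32, 32)           # dummy image batch
y = torch.randint(0, 10, (8,))          # dummy labels

snr_db = float(torch.empty(1).uniform_(0.0, 20.0))  # vary the channel per batch
x_noisy = awgn(x, snr_db)               # AWGN stands in for the channel
with torch.no_grad():
    t_logits = teacher(x)               # teacher sees the clean input
s_logits = student(x_noisy)             # student must be channel-robust
loss = distillation_loss(s_logits, t_logits, y)
opt.zero_grad(); loss.backward(); opt.step()
```

Sampling the SNR per batch, as above, is one simple way to expose the student to "various channel conditions"; the paper's CAT block and two-stage schedule refine this basic recipe.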