Daily Arxiv

This page curates AI-related papers published worldwide.
All content here is summarized using Google Gemini and operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

Large-Scale Model Enabled Semantic Communication Based on Robust Knowledge Distillation

Created by
  • Haebom

Author

Kuiyuan Ding, Caili Guo, Yang Yang, Zhongtian Du, Walid Saad

Outline

This paper proposes Robust Knowledge Distillation-based Semantic Communication (RKD-SC), a novel framework for designing large-scale model (LSM)-based semantic communication (SC) systems that are both efficient and robust to channel noise. RKD-SC combines a Knowledge Distillation-based Lightweight Differentiable Architecture Search (KDL-DARTS) algorithm with a two-stage Robust Knowledge Distillation (RKD) algorithm to shrink the model and harden it against channel noise while preserving the LSM's performance. It further introduces a Channel-Aware Transformer (CAT) block, trained with variable-length outputs under diverse channel conditions, to improve resilience to channel impairments. Extensive simulations on image classification tasks show that RKD-SC substantially reduces model parameters while maintaining high performance and exhibiting superior robustness compared to existing methods.
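The paper does not spell out the RKD loss here, but its two core ingredients — distilling a large teacher into a lightweight student, and training under channel noise — can be illustrated with a minimal sketch. The sketch below assumes a standard Hinton-style temperature-softened distillation loss and an AWGN channel model; the function names (`distillation_loss`, `awgn`) and the specific loss form are illustrative assumptions, not the paper's actual RKD algorithm.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-softened softmax; higher T yields a softer distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened outputs,
    # scaled by T^2 so gradients keep a comparable magnitude.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

def awgn(x, snr_db, rng=None):
    # Add white Gaussian noise at a target SNR (in dB) to a feature
    # vector, mimicking the noisy channel between semantic encoder
    # and decoder during robustness training.
    rng = rng or np.random.default_rng(0)
    signal_power = np.mean(np.square(x))
    noise_power = signal_power / (10 ** (snr_db / 10))
    return x + rng.normal(0.0, np.sqrt(noise_power), size=np.shape(x))
```

In a training loop, the student would see channel-corrupted features (e.g. `awgn(features, snr_db=10)`) while being pulled toward the teacher's clean predictions via `distillation_loss`, which is the general idea behind distilling robustness into a smaller model.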

Takeaways, Limitations

Takeaways:
  • Presents an efficient semantic communication framework that addresses the high computational complexity and resource requirements of large-scale models.
  • Offers a knowledge distillation-based lightweight model design and a training method robust to channel noise.
  • Improves robustness against channel impairments through channel-aware transformer blocks.
  • Demonstrates superior performance and robustness compared to existing methods on image classification tasks.
Limitations:
  • The proposed method is validated only through simulations on image classification tasks; verification on other tasks and in real-world environments is needed.
  • Further research is needed on optimal parameter settings for the KDL-DARTS and RKD algorithms.
  • The design and optimization of the CAT block are not explained in detail.
  • The various channel conditions may lack precise definitions and details.