MoE-CL is a parameter-efficient adversarial mixture-of-experts framework for the continual learning of large language models (LLMs), allowing them to self-evolve and adapt to dynamic data distributions in large-scale industrial environments. MoE-CL uses a dual-expert design: dedicated LoRA experts preserve task-specific knowledge, while shared LoRA experts enable cross-task transfer. Through adversarial learning, the shared experts acquire generalized representations and the dedicated experts retain task-specific details, balancing knowledge retention against cross-task generalization. Extensive experiments on the public MTL5 benchmark and the industrial Tencent3 benchmark demonstrate the effectiveness of MoE-CL, and in A/B testing for content compliance review on the Tencent Video platform it reduced manual review costs by 15.3%.
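To make the dual-expert idea concrete, the following is a minimal PyTorch sketch, not the authors' implementation: a frozen base projection augmented with a shared LoRA expert and per-task dedicated LoRA experts, where a task discriminator over the shared expert's output (here realized with a gradient-reversal layer, which is an assumption about how the adversarial objective is wired) encourages task-invariant shared representations. All class and parameter names (`DualExpertBlock`, `LoRAAdapter`, `lam`, etc.) are hypothetical.

```python
import torch
import torch.nn as nn


class LoRAAdapter(nn.Module):
    """Low-rank adapter producing a (alpha / r)-scaled B(A(x)) update."""

    def __init__(self, d_model: int, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.A = nn.Linear(d_model, r, bias=False)
        self.B = nn.Linear(r, d_model, bias=False)
        nn.init.zeros_(self.B.weight)  # standard LoRA init: adapter starts as a no-op
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.B(self.A(x)) * self.scale


class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; flips (and scales) gradients in the backward pass."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x

    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lam * grad_out, None


class DualExpertBlock(nn.Module):
    """Hypothetical MoE-CL-style block: frozen base weight + shared LoRA expert
    (trained adversarially toward task-invariance) + one dedicated LoRA expert
    per task (preserving task-specific knowledge)."""

    def __init__(self, d_model: int, num_tasks: int, r: int = 8):
        super().__init__()
        self.base = nn.Linear(d_model, d_model)  # stands in for a frozen LLM projection
        for p in self.base.parameters():
            p.requires_grad_(False)
        self.shared = LoRAAdapter(d_model, r)                 # cross-task transfer
        self.dedicated = nn.ModuleList(                       # task-specific knowledge
            LoRAAdapter(d_model, r) for _ in range(num_tasks)
        )
        self.task_discriminator = nn.Linear(d_model, num_tasks)

    def forward(self, x: torch.Tensor, task_id: int, lam: float = 1.0):
        # x: (batch, seq_len, d_model)
        shared_out = self.shared(x)
        out = self.base(x) + shared_out + self.dedicated[task_id](x)
        # Adversarial branch: the discriminator predicts the task from the shared
        # expert's (mean-pooled) output; the reversed gradient pushes the shared
        # expert toward generalized, task-invariant representations.
        pooled = GradientReversal.apply(shared_out, lam).mean(dim=1)
        task_logits = self.task_discriminator(pooled)
        return out, task_logits


if __name__ == "__main__":
    block = DualExpertBlock(d_model=768, num_tasks=3)
    x = torch.randn(2, 16, 768)
    out, task_logits = block(x, task_id=1)
    print(out.shape, task_logits.shape)  # torch.Size([2, 16, 768]) torch.Size([2, 3])
```

In training, `task_logits` would feed a cross-entropy loss against the true task id alongside the main task loss; only the LoRA adapters and the discriminator are updated, keeping the approach parameter-efficient.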