This paper proposes Multi-Grain Knowledge Distillation (MGKD), a novel framework that integrates ex ante risk assessment and in-service default detection for financial risk management. MGKD aims to improve ex ante risk prediction by leveraging in-service user behavior data. Following the knowledge distillation paradigm, a teacher model trained on in-service data supervises a student model trained on ex ante data. The framework aligns the teacher's and student's representations and predictions through multi-grain distillation strategies, including coarse-grained, fine-grained, and self-distillation, and adopts a reweighting strategy to mitigate model bias against minority classes. Experimental results on a large-scale real-world dataset from Tencent Mobile Payment demonstrate the effectiveness of the proposed approach in both offline and online environments.
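To make the teacher–student setup concrete, the following is a minimal sketch of a distillation objective of the kind the abstract describes: a temperature-softened KL term aligning the student's predictions with the teacher's, plus a class-reweighted cross-entropy term on the ground-truth labels to counter minority-class bias. This is an illustrative assumption, not MGKD's actual loss; the function names, the temperature `T`, and the mixing weight `alpha` are all hypothetical, and the multi-grain (coarse/fine/self) structure is not modeled here.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      class_weights, T=2.0, alpha=0.5):
    # Soft-label term: KL(teacher || student) at temperature T,
    # scaled by T^2 as is conventional in distillation.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)

    # Hard-label term: cross-entropy reweighted per ground-truth class,
    # upweighting the minority (e.g. default) class.
    p = softmax(student_logits)
    ce = -np.log(p[np.arange(len(labels)), labels] + 1e-12)
    w = np.asarray(class_weights)[labels]

    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * w * ce))
```

In practice the KL term transfers the teacher's in-service knowledge to the student, while the reweighted cross-entropy keeps the student anchored to the (imbalanced) ex ante labels.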