This paper proposes a novel approach to model collapse caused by repeated training on synthetic data, motivated by projections that most training data will be machine-generated by 2030. We identify a model's overconfidence in its own generated data as a primary cause of collapse and propose Truncated Cross Entropy (TCE), a confidence-aware loss function that downweights high-confidence predictions. Theoretical and experimental analyses show that TCE extends the performance retention period before collapse by more than 2.3 times and that it generalizes across modalities. In conclusion, our loss function design offers a simple yet powerful tool for maintaining the quality of generative models in the era of synthetic data.