In this paper, we develop a machine learning-based framework to accurately model the anode heel effect in Monte Carlo simulations of the __T92280_____ line imaging system. We train multiple regression models to predict the spatial intensity variation along the anode-cathode axis using experimental weights obtained from beam measurements at various tube voltages. These weights capture the asymmetry introduced by the anode heel effect. We establish a systematic fine-tuning protocol to minimize the number of measurements required while maintaining model accuracy. We implement the models in OpenGATE 10 and the GGEMS Monte Carlo toolkit to evaluate their integration feasibility and predictive performance. Among the tested models, Gradient Boosting Regression (GBR) achieves the highest accuracy, with prediction errors below 5% at all energy levels. The optimized fine-tuning strategy requires only six detector positions per energy level, reducing the measurement effort by 65%. The maximum error introduced by this fine-tuning process is below 2%. Comparison of dose factors within Monte Carlo simulations demonstrates that the GBR-based model accurately replicates clinical beam profiles and significantly outperforms conventional symmetric beam models. This study presents a robust and generalizable method for incorporating the anode heel effect into Monte Carlo simulations using machine learning. It enhances simulation realism for applications in clinical dosimetry, image quality assessment, and radiation protection by enabling accurate, energy-dependent beam modeling from limited calibration data.
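To illustrate the kind of model described above, the following sketch fits a scikit-learn GradientBoostingRegressor that maps tube voltage and position along the anode-cathode axis to a relative intensity weight. The training values, positions, and hyperparameters here are illustrative placeholders, not the authors' calibration data or implementation; real weights would come from the beam measurements described in the paper.

```python
# Minimal sketch (not the authors' code): a Gradient Boosting Regression
# model predicting heel-effect intensity weights from tube voltage (kVp)
# and detector position along the anode-cathode axis (cm).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical calibration data: (tube voltage, position) pairs with
# measured relative intensity weights normalized to the central axis.
X = np.array([
    [70, -10], [70, -5], [70, 0], [70, 5], [70, 10],
    [90, -10], [90, -5], [90, 0], [90, 5], [90, 10],
])
y = np.array([1.10, 1.05, 1.00, 0.93, 0.85,
              1.08, 1.04, 1.00, 0.94, 0.88])

model = GradientBoostingRegressor(n_estimators=200, max_depth=3,
                                  learning_rate=0.05, random_state=0)
model.fit(X, y)

# Predict weights on a fine spatial grid at an intermediate tube voltage;
# such a grid could serve as a spatially varying source weighting in a
# Monte Carlo simulation.
positions = np.linspace(-10, 10, 41)
grid = np.column_stack([np.full_like(positions, 80.0), positions])
weights = model.predict(grid)
print(weights[:5])
```

In a workflow like the one outlined above, the predicted weight grid would then be passed to the Monte Carlo source model (e.g., as per-position emission weights) rather than used directly for dosimetry.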