This paper presents a data-driven analysis of hidden license conflicts in the open-source AI ecosystem. We conducted a comprehensive license compliance audit spanning 360,000 datasets and 1.6 million models from Hugging Face, along with 140,000 GitHub projects. We found systematic noncompliance in the transition from model to application: 35.5% of projects ignore restrictive license terms and relicense under permissive licenses. We further developed an extensible rules engine encompassing roughly 200 SPDX and model-specific license clauses, and we show that it resolves 86.4% of license conflicts in software applications. We release our dataset, prototype engine, and findings to support future research.