To address the challenge of generating large-scale, domain-specific, multilingual multi-turn conversation datasets, this paper presents Chain-of-Intent, a novel framework that integrates Hidden Markov Models (HMMs) with Large Language Models (LLMs). Chain-of-Intent extracts domain-specific intent transition patterns from real-world e-commerce chat logs and leverages them to model turn-level dynamics and intent sequences. It then parameterizes the HMM's emission probabilities with LLMs to generate natural, contextually consistent utterances that align with the predicted intent and the conversation history. We further propose MINT-CL, a multi-task contrastive learning framework that improves intent classification performance while reducing reliance on large annotated datasets. Experimental results show that the proposed method outperforms competitive baselines in both dialogue generation quality and intent classification accuracy, particularly in multilingual settings. Finally, we release MINT-E, a comprehensive multilingual, intent-aware, multi-turn conversation corpus from the e-commerce domain, to support future research.
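
To make the intent-driven generation process concrete, the following is a minimal, illustrative sketch of the Chain-of-Intent idea: sample a turn-level intent sequence from HMM-style start and transition probabilities mined from chat logs, then fill each turn with an utterance conditioned on the intent and the dialogue history. The intents, probabilities, and the `generate_utterance` stand-in are hypothetical placeholders for exposition, not the paper's learned parameters or prompts.

```python
import random

# Hypothetical intent inventory and HMM parameters; in the paper these would be
# mined from real e-commerce chat logs rather than hard-coded.
START_PROBS = {"track_order": 0.5, "return_item": 0.3, "refund_status": 0.2}
TRANS_PROBS = {
    "track_order":   {"track_order": 0.2, "return_item": 0.5, "refund_status": 0.3},
    "return_item":   {"track_order": 0.1, "return_item": 0.3, "refund_status": 0.6},
    "refund_status": {"track_order": 0.3, "return_item": 0.2, "refund_status": 0.5},
}

def sample_intent_sequence(num_turns: int) -> list[str]:
    """Sample a turn-level intent sequence from the HMM transition structure."""
    intents = [random.choices(list(START_PROBS), weights=list(START_PROBS.values()))[0]]
    for _ in range(num_turns - 1):
        nxt = TRANS_PROBS[intents[-1]]
        intents.append(random.choices(list(nxt), weights=list(nxt.values()))[0])
    return intents

def generate_utterance(intent: str, history: list[str]) -> str:
    """Stand-in for the LLM 'emission' step: in the framework described above an
    LLM generates an utterance conditioned on the predicted intent and the
    dialogue history; here a template is returned so the sketch runs offline."""
    return f"[user turn with intent '{intent}', given {len(history)} prior turns]"

def generate_conversation(num_turns: int = 3) -> list[tuple[str, str]]:
    """Chain intent sampling and utterance generation into one synthetic dialogue."""
    history: list[str] = []
    dialogue = []
    for intent in sample_intent_sequence(num_turns):
        utterance = generate_utterance(intent, history)
        history.append(utterance)
        dialogue.append((intent, utterance))
    return dialogue

if __name__ == "__main__":
    for intent, utterance in generate_conversation():
        print(intent, "->", utterance)
```

In this sketch the HMM contributes only the intent-level structure of the session, while utterance realization is delegated to the (placeholder) generator, mirroring the separation between transition and emission modeling described above.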