In this paper, we propose DualSG, a novel dual-stream framework for multivariate time series forecasting (MTSF) in which a large language model (LLM) serves as a semantic guidance module that complements a conventional numerical forecaster, rather than acting as a standalone predictor. This design addresses three weaknesses of existing LLM-based forecasting methods: degraded numerical precision, forcing the LLM to process patterns outside its design intent, and the difficulty of aligning modalities in a shared latent space. Instead of generating forecasts directly, the LLM refines the predictions produced by the numerical stream. To give the LLM interpretable context, DualSG introduces an explicit prompt format called the Time Series Caption, which summarizes time series patterns in natural language. Experiments on real-world datasets from diverse domains show that DualSG consistently outperforms 15 state-of-the-art baselines.
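To make the dual-stream idea concrete, the following is a minimal illustrative sketch, not the paper's implementation: a numerical stream produces a base forecast, a caption summarizes the input window in natural language, and a refinement step stands in for the LLM-guided correction. All function names (numerical_forecast, simple_caption, semantic_refine) and the naive trend/volatility rules are hypothetical placeholders; DualSG itself uses a learned forecaster and an actual LLM conditioned on the Time Series Caption.

```python
# Illustrative sketch of a dual-stream forecast (assumed structure, not DualSG's code).
import numpy as np


def numerical_forecast(window: np.ndarray, horizon: int) -> np.ndarray:
    """Numerical stream: naive linear-trend extrapolation as a stand-in
    for a learned numerical forecaster."""
    t = np.arange(len(window))
    slope, intercept = np.polyfit(t, window, 1)
    future_t = np.arange(len(window), len(window) + horizon)
    return slope * future_t + intercept


def simple_caption(window: np.ndarray) -> str:
    """Semantic stream: summarize the window in natural language,
    loosely analogous to a Time Series Caption prompt."""
    trend = "rising" if window[-1] > window[0] else "falling"
    volatility = "volatile" if np.std(np.diff(window)) > 0.5 * np.std(window) else "smooth"
    return f"The recent series is {trend} and {volatility}."


def semantic_refine(base_pred: np.ndarray, caption: str) -> np.ndarray:
    """Placeholder for LLM-guided refinement: in DualSG the caption conditions
    an LLM whose output adjusts the numerical prediction; here we only damp
    the forecast toward its mean when the caption reports a volatile series."""
    if "volatile" in caption:
        return 0.9 * base_pred + 0.1 * base_pred.mean()
    return base_pred


if __name__ == "__main__":
    history = np.array([10.0, 10.4, 10.9, 11.2, 11.8, 12.1, 12.7, 13.0])
    base = numerical_forecast(history, horizon=4)
    caption = simple_caption(history)
    refined = semantic_refine(base, caption)
    print(caption)
    print("base forecast:   ", np.round(base, 2))
    print("refined forecast:", np.round(refined, 2))
```

The sketch is only meant to show the division of labor: the numerical stream keeps full numerical precision, while the semantic stream contributes a coarse, interpretable correction signal rather than producing forecasts on its own.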