This paper proposes CITRAS, a novel model that effectively leverages covariates in time series forecasting. Existing models fail to account for the length difference between known future covariates and the target variables, and they struggle to accurately capture the dependencies between targets and covariates. To address these challenges, CITRAS flexibly leverages multiple targets and covariates, both past and known future, within a decoder-only Transformer. Specifically, it introduces two novel mechanisms for patch-wise cross-variable attention: "key-value (KV) shifting" and "attention score smoothing." KV shifting seamlessly incorporates known future covariate information into target prediction, while attention score smoothing captures global inter-variable dependencies without sacrificing local accuracy. Experimental results demonstrate that CITRAS outperforms state-of-the-art models on thirteen real-world benchmarks.
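The abstract names the two mechanisms without defining them. As a rough illustration only, assuming KV shifting offsets the covariate key/value patches into the known future and attention score smoothing blends each patch's attention scores with their global average, a minimal NumPy sketch (all function names, shapes, and parameters hypothetical, not the paper's actual formulation) might look like:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_variable_attention(target_patches, covariate_patches, shift=1, smooth=0.5):
    """Illustrative patch-wise cross-variable attention (hypothetical sketch).

    target_patches:    (T, d) query patches from the target variable
    covariate_patches: (T + shift, d) patches from a covariate series,
                       extending `shift` patches into the known future
    """
    d = target_patches.shape[1]
    # KV shifting (assumed form): keys/values are taken `shift` patches
    # ahead of the queries, so each target patch can attend to covariate
    # values already known for its forecast window.
    keys = covariate_patches[shift:]     # (T, d)
    values = covariate_patches[shift:]   # (T, d)
    scores = target_patches @ keys.T / np.sqrt(d)        # (T, T)
    # Attention score smoothing (assumed form): blend per-patch scores
    # with their average over all patches, mixing a global view of the
    # cross-variable dependency into each local attention pattern.
    global_scores = scores.mean(axis=0, keepdims=True)   # (1, T)
    scores = (1 - smooth) * scores + smooth * global_scores
    weights = softmax(scores, axis=-1)
    return weights @ values              # (T, d)

rng = np.random.default_rng(0)
out = cross_variable_attention(rng.normal(size=(8, 16)),
                               rng.normal(size=(9, 16)))
```

The shift makes the covariate input one patch longer than the target input, which is one simple way to reconcile the length difference the abstract describes; the actual CITRAS mechanisms may differ.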