This paper presents a novel time-series outlier detection model for the zero-shot setting. To overcome the limitations of existing reconstruction-based models, we propose \texttt{TimeRCD}. Rather than reconstructing its input, \texttt{TimeRCD} builds on relative context discrepancy (RCD): it learns to detect discrepancies between adjacent time windows, on the premise that such contextual shifts signal outliers. A Transformer architecture captures these contextual changes. For effective pretraining, we construct a large-scale synthetic dataset with token-level outlier labels. In zero-shot time-series outlier detection, \texttt{TimeRCD} outperforms existing models.
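To make the RCD idea concrete, one plausible formalization (the notation below is ours for illustration, not the model's actual definition) concatenates a pair of adjacent windows, encodes them with a Transformer, and trains a token-wise classifier against the synthetic token-level labels:
\begin{align}
  h_{1:2w} &= f_\theta\!\bigl(\,[\,x_{t-w+1:t}\,;\, x_{t+1:t+w}\,]\,\bigr), \qquad
  \hat{y}_i = \sigma\bigl(g_\phi(h_i)\bigr), \\
  \mathcal{L} &= -\frac{1}{2w}\sum_{i=1}^{2w}\Bigl[\, y_i \log \hat{y}_i + (1-y_i)\log\bigl(1-\hat{y}_i\bigr) \,\Bigr],
\end{align}
where $f_\theta$ denotes the Transformer encoder, $g_\phi$ a per-token scoring head, and $y_i \in \{0,1\}$ the token-level outlier labels; under this reading, a token is flagged when its representation is inconsistent with the context supplied by the neighboring window, rather than when it is poorly reconstructed.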