Daily Arxiv

This page organizes papers on artificial intelligence published around the world.
The summaries are generated with Google Gemini, and the page is operated on a non-profit basis.
Copyright of each paper belongs to its authors and their institutions; when sharing, please cite the source.

Towards Foundation Models for Zero-Shot Time Series Anomaly Detection: Leveraging Synthetic Data and Relative Context Discrepancy

Created by
  • Haebom

Author

Tian Lan, Hao Duong Le, Jinbo Li, Wenjun He, Meng Wang, Chenghao Liu, Chen Zhang

TimeRCD: A Foundation Model for Time Series Anomaly Detection with Relative Context Discrepancy

Outline

This paper develops a foundation model for zero-shot time-series anomaly detection. To overcome the limitations of existing reconstruction-based approaches, the authors propose TimeRCD, which, instead of reconstructing its input, is built on the Relative Context Discrepancy (RCD) paradigm: it learns to detect discrepancies between adjacent time windows. A Transformer architecture captures the contextual shifts that signal anomalies. For effective pre-training, the authors construct a large-scale synthetic dataset with token-level anomaly labels. TimeRCD outperforms existing models in zero-shot time-series anomaly detection.
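To make the RCD idea concrete, here is a minimal, hypothetical sketch, not the authors' actual architecture or hyperparameters: a small Transformer encoder reads a context window and its adjacent target window together and emits a per-timestep (token-level) anomaly logit for the target window. All names here (RCDSketch, context, target, the layer sizes) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class RCDSketch(nn.Module):
    """Minimal sketch of the relative-context-discrepancy idea:
    a Transformer encoder sees a context window and its adjacent target
    window concatenated, and predicts a per-timestep (token-level)
    anomaly logit for the target window. Sizes are illustrative."""

    def __init__(self, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)            # project raw values to tokens
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        self.head = nn.Linear(d_model, 1)             # token-level anomaly logit

    def forward(self, context, target):
        # context, target: (batch, window_len, 1) univariate series
        tokens = torch.cat([context, target], dim=1)  # adjacent windows side by side
        h = self.encoder(self.embed(tokens))          # attention lets target tokens
                                                      # compare themselves to context
        target_h = h[:, context.size(1):, :]          # keep only the target window
        return self.head(target_h).squeeze(-1)        # (batch, window_len) logits

# Usage sketch: pretrain with BCE against token-level labels from synthetic data.
model = RCDSketch()
ctx = torch.randn(8, 128, 1)
tgt = torch.randn(8, 128, 1)
labels = torch.zeros(8, 128)                          # synthetic token-level labels
loss = nn.BCEWithLogitsLoss()(model(ctx, tgt), labels)
```

Under this reading, pre-training would fit the scoring head with binary cross-entropy against the synthetic token-level labels, and at inference the logits would serve directly as zero-shot anomaly scores.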

Takeaways, Limitations

Takeaways:
Overcomes the limitations of reconstruction-based methods by proposing a new anomaly detection paradigm based on relative context discrepancy (RCD).
Demonstrates the potential of a general-purpose anomaly detection model through improved zero-shot performance across diverse datasets.
Lays the groundwork for effective pre-training by building a large-scale synthetic dataset with token-level anomaly labels (a minimal sketch of such label generation follows this list).
Limitations:
The paper does not explicitly discuss its own limitations.
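The pre-training corpus described above relies on synthetic series with token-level labels. The snippet below is an assumed, minimal example of how such labels could be produced by injecting point anomalies into a clean signal; the function inject_spike_anomaly and its parameters are hypothetical and not taken from the paper.

```python
import numpy as np

def inject_spike_anomaly(series, rng, max_count=3, magnitude=5.0):
    """Illustrative synthetic-data step (not the authors' exact generator):
    inject a few point anomalies into a clean series and return per-timestep
    labels, mirroring the token-level supervision described in the summary."""
    labels = np.zeros_like(series)
    for _ in range(rng.integers(1, max_count + 1)):
        idx = rng.integers(0, len(series))
        series[idx] += magnitude * rng.standard_normal() * series.std()
        labels[idx] = 1.0
    return series, labels

# Usage sketch: build one labeled training series from a clean sine wave.
rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 20 * np.pi, 1024)) + 0.1 * rng.standard_normal(1024)
noisy, token_labels = inject_spike_anomaly(clean.copy(), rng)
```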