Daily Arxiv

This page collects papers on artificial intelligence published around the world.
The summaries are generated with Google Gemini, and the page is operated on a non-profit basis.
The copyright of each paper belongs to its authors and their institutions; when sharing, simply cite the source.

Relevance-Aware Thresholding in Online Conformal Prediction for Time Series

Created by
  • Haebom

Author

Théo Dupuy, Binbin Xu, Stéphane Perrey, Jacky Montmain, Abdelhak Imoussaten

Outline

This paper proposes an improvement to Online Conformal Prediction (OCP), a branch of uncertainty quantification in machine learning. It highlights a limitation of existing OCP methods, which update the threshold solely to maintain coverage validity of the prediction intervals as the data distribution changes over time. Instead, the authors propose a novel threshold update rule that also reflects the "relevance" of the prediction interval, with the aim of producing narrower intervals. The effectiveness of the proposed method is demonstrated through experiments on real-world datasets.
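To make the setting concrete, the sketch below shows a standard adaptive-threshold OCP loop (in the spirit of adaptive conformal inference): after each observation, the miscoverage level is nudged up or down depending on whether the last interval covered the true value, and the threshold is the corresponding empirical quantile of past nonconformity scores. This is a minimal, generic sketch under those assumptions, not the paper's relevance-aware update rule (whose relevance function is not specified here); the function name online_conformal_intervals and the parameters gamma and warmup are illustrative choices.

```python
import numpy as np

def online_conformal_intervals(y, y_pred, alpha=0.1, gamma=0.01, warmup=50):
    """Generic ACI-style online conformal prediction sketch (illustrative only).

    At each step t, the interval is y_pred[t] +/- q_t, where q_t is the
    (1 - alpha_t) empirical quantile of past absolute residuals, and alpha_t
    is adapted online based on observed coverage errors.
    """
    n = len(y)
    scores = []                        # past nonconformity scores |y - y_pred|
    alpha_t = alpha                    # adaptive miscoverage level
    lower = np.full(n, np.nan)
    upper = np.full(n, np.nan)

    for t in range(n):
        if t >= warmup:
            # Threshold = empirical (1 - alpha_t) quantile of past scores
            # (alpha_t is clamped to keep the quantile level in [0, 1]).
            q = np.quantile(scores, min(max(1.0 - alpha_t, 0.0), 1.0))
            lower[t], upper[t] = y_pred[t] - q, y_pred[t] + q
            # err = 1 if the realized value falls outside the interval.
            err = float(not (lower[t] <= y[t] <= upper[t]))
            # ACI-style update: widen after misses, tighten after covers.
            alpha_t += gamma * (alpha - err)
        scores.append(abs(y[t] - y_pred[t]))

    return lower, upper

# Toy usage: random-walk series with a naive one-step-ahead forecast.
rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=500))
y_pred = np.concatenate(([0.0], y[:-1]))
lo, hi = online_conformal_intervals(y, y_pred, alpha=0.1)
empirical_coverage = np.mean((lo[50:] <= y[50:]) & (y[50:] <= hi[50:]))
```

A relevance-aware variant, as described in the paper's outline, would additionally modulate this threshold update using a measure of how relevant (informative) the current interval is, which is the part the proposed method changes relative to the generic update above.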

Takeaways, Limitations

Takeaways:
Improves prediction performance by considering the relevance of prediction intervals, which was overlooked in existing OCP methodologies.
Prevents abrupt changes in the prediction interval during the threshold update process, allowing for narrower prediction intervals.
The validity of the proposed method is demonstrated through experiments on real datasets.
Limitations:
The description of the specific relevance function, and of the criteria for choosing it, may be insufficiently detailed.
Further research may be needed to determine the generalizability of the proposed method and its applicability to other types of time series data.
Consideration may need to be given to the potential increase in computational complexity.