This paper proposes an improvement to online conformal prediction (OCP), a branch of uncertainty quantification in machine learning. We first highlight a limitation of existing OCP methods: to adapt to changes in the data distribution over time, they update their thresholds based solely on the coverage validity of the prediction intervals. We instead propose a novel threshold update rule that also reflects the "relevance" of the prediction interval, with the aim of reducing interval width. Experiments on real-world datasets demonstrate the effectiveness of the proposed method.
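For context, the sketch below illustrates a standard coverage-driven OCP threshold update (in the style of online quantile tracking), which is the kind of baseline the abstract contrasts against; it does not reproduce the proposed relevance-based rule. The learning rate `eta`, the score sequence, and all variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ocp_threshold_tracking(scores, alpha=0.1, eta=0.05, q0=1.0):
    """Coverage-only online threshold update for OCP (baseline sketch).

    The threshold q_t defines the prediction set {y : s_t(y) <= q_t}.
    After observing the true nonconformity score s_t, the threshold is
    raised on a miss and lowered on a cover, so that long-run coverage
    approaches 1 - alpha.
    """
    q = q0
    thresholds = []
    for s in scores:
        thresholds.append(q)
        err = float(s > q)           # 1 if the interval missed the true value
        q = q + eta * (err - alpha)  # widen after a miss, shrink after a cover
    return np.array(thresholds)

# Toy usage: nonconformity scores whose distribution drifts over time.
rng = np.random.default_rng(0)
scores = np.abs(rng.normal(loc=np.linspace(0.0, 2.0, 500), scale=1.0))
qs = ocp_threshold_tracking(scores, alpha=0.1, eta=0.05)
print(f"empirical coverage: {np.mean(scores <= qs):.3f}")
```

Note that this baseline reacts only to coverage errors; the proposed method would additionally factor in how relevant (e.g., how wide) the produced interval is when updating the threshold.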