Daily Arxiv

This page curates AI-related papers published worldwide.
All summaries are generated with Google Gemini, and the page is operated on a non-profit basis.
Copyright for each paper belongs to its authors and their institutions; please credit the source when sharing.

Conformal Prediction of Classifiers with Many Classes based on Noisy Labels

Created by
  • Haebom

Authors

Coby Penso, Jacob Goldberger, Ethan Fetaya

Outline

This paper studies Conformal Prediction (CP), which controls the uncertainty of a classification system by producing a small set of candidate predictions rather than a single label. CP defines a score based on the model's predictions and uses a validation set to set a threshold on that score. This work addresses the calibration of CP when only a validation set with noisy labels is available: it proposes a method for estimating the noise-free conformal threshold from noisy-label data and derives a finite-sample coverage guarantee under uniform label noise that remains effective even for tasks with many classes. The resulting method is called Noise-Aware Conformal Prediction (NACP), and its performance is demonstrated on several standard image classification datasets.
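To make the calibration step described above concrete, here is a minimal sketch of standard split-conformal calibration, assuming softmax outputs and the common 1 − p(label) score; the function names and the choice of score are illustrative, not taken from the paper.

```python
import numpy as np

def conformal_threshold(probs, labels, alpha=0.1):
    """Split-conformal calibration: find the score threshold that
    covers the labeled class with probability >= 1 - alpha.

    probs  : (n, K) array of softmax outputs on the validation set
    labels : (n,)   array of validation labels (possibly noisy)
    """
    n = len(labels)
    # Score = 1 - probability assigned to the labeled class
    # (higher score means the model finds the label more surprising).
    scores = 1.0 - probs[np.arange(n), labels]
    # Finite-sample quantile correction from split conformal prediction.
    q = np.ceil((n + 1) * (1 - alpha)) / n
    return np.quantile(scores, min(q, 1.0), method="higher")

def prediction_set(probs_test, threshold):
    """Conformal set: every class whose score falls below the threshold."""
    return [np.where(1.0 - p <= threshold)[0] for p in probs_test]
```

With clean validation labels this recipe yields the usual marginal guarantee that the prediction set contains the true label with probability at least 1 − α; with noisy labels the same recipe miscalibrates the threshold, which is exactly the problem this paper addresses.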

Takeaways, Limitations

Takeaways: NACP makes Conformal Prediction usable when only noisy-label data is available for calibration, broadening its applicability to real-world data, and it retains a finite-sample coverage guarantee even in classification problems with many classes.
Limitations: The coverage guarantee holds for uniform label noise; performance under other noise models requires further study. The method is validated only empirically, on standard image classification datasets, and the scope of the theoretical analysis may be correspondingly limited.
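To illustrate the uniform-noise setting, below is a hedged sketch of one way a noise-free threshold could be estimated from noisy labels. This is not the paper's NACP estimator, just a minimal illustration of unmixing the noisy score distribution under the stated noise model; the known noise rate `eps`, the score choice, and the approximation of the flip distribution by wrong-class scores are all assumptions.

```python
import numpy as np

def noise_adjusted_threshold(probs, noisy_labels, eps, alpha=0.1):
    """Illustrative unmixing under uniform label noise with known rate eps:
    the observed label equals the true label w.p. 1 - eps and is uniform
    over the other K - 1 classes otherwise, so for any threshold t

        F_noisy(t) = (1 - eps) * F_clean(t) + eps * F_flip(t).

    We estimate F_flip from wrong-class scores (approximating "wrong w.r.t.
    the true label" by "wrong w.r.t. the observed label", reasonable for
    small eps), solve for F_clean, and pick the smallest threshold whose
    estimated clean coverage reaches 1 - alpha.
    """
    n, K = probs.shape
    scores = 1.0 - probs                          # (n, K): score of every class
    scores_noisy = scores[np.arange(n), noisy_labels]

    mask = np.ones_like(scores, dtype=bool)
    mask[np.arange(n), noisy_labels] = False
    scores_flip = np.sort(scores[mask])           # n * (K - 1) wrong-class scores

    grid = np.sort(scores_noisy)
    F_noisy = np.searchsorted(grid, grid, side="right") / n
    F_flip = np.searchsorted(scores_flip, grid, side="right") / scores_flip.size

    # Unmix, then clip and monotonize the estimated clean CDF.
    F_clean = (F_noisy - eps * F_flip) / (1.0 - eps)
    F_clean = np.maximum.accumulate(np.clip(F_clean, 0.0, 1.0))

    idx = np.searchsorted(F_clean, 1.0 - alpha, side="left")
    return grid[min(idx, n - 1)]
```

The returned threshold plugs into the same `prediction_set` routine sketched earlier; the paper's contribution is a principled version of this estimation with a finite-sample coverage guarantee.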