Poster in Workshop: The 2nd Workshop on Reliable and Responsible Foundation Models
Conformal Risk Minimization with Variance Reduction
Sima Noorani · Orlando Romero · Nicolo Dal Fabbro · Hamed Hassani · George Pappas
Keywords: [ reliable machine learning ] [ conformal risk minimization ] [ uncertainty quantification ] [ conformal prediction ] [ length efficiency in conformal prediction ]
Conformal prediction (CP) is a distribution-free framework for achieving probabilistic guarantees on black-box models. CP is typically applied to a model post-training. Recent research efforts, however, have focused on optimizing CP efficiency during training. We formalize this concept as the problem of conformal risk minimization (CRM). In this direction, conformal training (ConfTr) by Stutz et al. (2022) is a CRM technique that seeks to minimize the expected prediction set size of a model by simulating CP in between training updates. In this paper, we provide a novel analysis of the ConfTr gradient estimation method, revealing a strong source of sample inefficiency that introduces training instability and limits its practical use. To address this challenge, we propose variance-reduced conformal training (VR-ConfTr), a CRM method that carefully incorporates a novel variance reduction technique into the gradient estimation of the ConfTr objective function. Through extensive experiments on various benchmark datasets, we demonstrate that VR-ConfTr consistently achieves faster convergence and smaller prediction sets compared to baselines.
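To make the idea of "simulating CP in between training updates" concrete, the sketch below (an illustrative assumption, not the authors' implementation; all function names are hypothetical) shows the basic ConfTr-style recipe: split each batch into calibration and prediction halves, compute a conformity threshold on the calibration half, and score the prediction half with a smooth (differentiable) surrogate of the prediction set size.

```python
import numpy as np

def softmax(logits):
    # numerically stable softmax over the class axis
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def conftr_size_loss(logits, labels, alpha=0.1, temperature=0.1):
    """Smooth surrogate of the expected prediction set size for one batch.

    Hypothetical sketch of the ConfTr recipe: split the batch in half,
    calibrate a threshold on one half, and measure a soft set size on
    the other. In a real training loop this would be written in an
    autodiff framework so gradients flow through the loss.
    """
    n = logits.shape[0]
    probs = softmax(logits)
    # conformity score: predicted probability of the true class
    cal_probs, pred_probs = probs[: n // 2], probs[n // 2:]
    cal_labels = labels[: n // 2]
    cal_scores = cal_probs[np.arange(len(cal_labels)), cal_labels]
    # threshold: empirical alpha-quantile of the calibration scores
    tau = np.quantile(cal_scores, alpha)
    # smooth set membership via a sigmoid; summing over classes gives a
    # differentiable proxy for the prediction set size
    soft_membership = 1.0 / (1.0 + np.exp(-(pred_probs - tau) / temperature))
    return soft_membership.sum(axis=1).mean()

rng = np.random.default_rng(0)
logits = rng.normal(size=(40, 5))
labels = rng.integers(0, 5, size=40)
print(conftr_size_loss(logits, labels))
```

The quantile step makes this loss a ratio-like function of random batch statistics, which is exactly where the paper locates the gradient-estimation variance that VR-ConfTr targets.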