Johan Hallberg Szabadvary: Reliable Machine Learning in Dynamic Environments (half-time seminar)
Time: Wednesday 2026-03-04, 13:30-14:30
Venue: Albano, Cramer Room
Speaker: Johan Hallberg Szabadvary
Abstract:
Machine learning models are increasingly deployed in safety-critical domains, yet they often fail to provide reliable uncertainty estimates. Conformal Prediction (CP) offers a rigorous framework for valid inference, guaranteeing that prediction sets contain the true outcome with a user-specified probability. Bridging the gap between CP theory and real-world applications, however, requires addressing three fundamental challenges: the practicality of set-valued outputs, robustness to non-exchangeable data, and the computational cost of online inference.
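The coverage guarantee described above can be illustrated with a minimal split-conformal sketch. The synthetic data, least-squares model, and variable names below are illustrative assumptions, not taken from the seminar's work.

```python
# Minimal sketch of split conformal prediction for regression:
# calibrate a residual quantile, then form prediction intervals
# that cover the truth with probability >= 1 - alpha.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2x + Gaussian noise
n = 2000
x = rng.uniform(0, 1, n)
y = 2 * x + rng.normal(0, 0.1, n)

# Split into training, calibration, and test sets
x_tr, x_cal, x_te = x[:1000], x[1000:1500], x[1500:]
y_tr, y_cal, y_te = y[:1000], y[1000:1500], y[1500:]

# Fit a simple model on the training set only
slope, intercept = np.polyfit(x_tr, y_tr, 1)

def predict(x):
    return slope * x + intercept

# Nonconformity scores on the calibration set: absolute residuals
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile at miscoverage level alpha = 0.1
# (the (n+1)(1-alpha)/n correction gives the finite-sample guarantee)
alpha = 0.1
n_cal = len(scores)
q = np.quantile(scores, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal,
                method="higher")

# Prediction intervals on the test set and their empirical coverage
lo, hi = predict(x_te) - q, predict(x_te) + q
coverage = float(np.mean((y_te >= lo) & (y_te <= hi)))
print(round(coverage, 2))  # close to the target level 0.9
```

The guarantee is distribution-free: it relies only on exchangeability of calibration and test points, not on the model being correct.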
In this half-time seminar, I will present contributions that address these gaps. First, we formalise classification with a “reject option”, deriving the exact, distribution-free error rate for accepted predictions and resolving inconsistencies in the prior heuristic literature. Second, we analyse Adaptive Conformal Inference (ACI) through the lens of control theory, demonstrating that ACI’s finite-sample coverage guarantee relies on feedback dynamics rather than statistical validity, which permits the use of computationally efficient Non-Conformal Confidence Predictors (NCCP) in high-frequency online settings. Third, we present online-cp, an open-source framework for real-time conformal inference and exchangeability testing.
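The reject-option idea can be sketched with a conformal classifier that accepts a prediction only when the prediction set is a singleton, and measures the error rate among accepted cases. The two-Gaussian data and the distance-based score below are illustrative assumptions, not the construction analysed in the seminar.

```python
# Sketch of conformal classification with a reject option:
# accept only singleton prediction sets; track the error rate
# conditional on acceptance.
import numpy as np

rng = np.random.default_rng(2)

# Two 1-D Gaussian classes; nonconformity = distance to the class mean
means = {0: -1.0, 1: 1.0}

def score(x, k):
    return abs(x - means[k])

# Calibration data and scores
n_cal = 1000
y_cal = rng.integers(0, 2, n_cal)
x_cal = np.array([rng.normal(means[k], 1.0) for k in y_cal])
cal_scores = np.array([score(x, k) for x, k in zip(x_cal, y_cal)])

# Conformal threshold at miscoverage level alpha
alpha = 0.1
q = np.quantile(cal_scores, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal,
                method="higher")

# Test data: prediction set = all classes scoring below the threshold
n_te = 2000
y_te = rng.integers(0, 2, n_te)
x_te = np.array([rng.normal(means[k], 1.0) for k in y_te])

accepted = errors = 0
for x, y in zip(x_te, y_te):
    pset = [k for k in (0, 1) if score(x, k) <= q]
    if len(pset) == 1:          # reject empty or multi-class sets
        accepted += 1
        errors += (pset[0] != y)

print(accepted, round(errors / accepted, 3))
```

Note that the error rate among accepted predictions is not simply alpha; characterising it exactly is the point of the first contribution above.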
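The feedback view of ACI can be made concrete with its update rule alpha_{t+1} = alpha_t + gamma * (alpha - err_t): the effective level is raised after a hit and lowered after a miss, which drives the long-run error rate toward the target regardless of the underlying predictor. The streaming data below (with a deliberate variance shift and a naive point prediction) is a synthetic illustration, not the seminar's experimental setup.

```python
# Sketch of the Adaptive Conformal Inference (ACI) update on a
# non-exchangeable stream: a feedback rule on the effective level
# keeps the long-run error rate near the target alpha.
import numpy as np

rng = np.random.default_rng(1)
alpha, gamma = 0.1, 0.05   # target miscoverage and step size
alpha_t = alpha            # effective level, updated online
errs = []
scores = []                # past nonconformity scores (absolute residuals)

for t in range(2000):
    y = rng.normal(0, 1 + (t > 1000))  # variance doubles at t = 1000
    pred = 0.0                          # deliberately naive point prediction
    if len(scores) > 10:
        # Interval half-width from the empirical score quantile at the
        # current effective level (clipped to a valid probability)
        q = np.quantile(scores, min(max(1 - alpha_t, 0.0), 1.0))
        err = float(abs(y - pred) > q)  # 1 if the interval missed y
        errs.append(err)
        # ACI feedback update: widen after a miss, tighten after a hit
        alpha_t += gamma * (alpha - err)
    scores.append(abs(y - pred))

# Long-run error rate stays near alpha even under the shift
print(round(float(np.mean(errs)), 2))
```

This is the control-theoretic point: the guarantee comes from the feedback loop on err_t, not from the scores being statistically valid, so cheap non-conformal predictors can be substituted.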
Finally, I will present preliminary theoretical results on the fundamental limits of distribution-shift detection. We identify the phenomenon of “Conformal Blindness”, proving the existence of A-cryptic change-points that are mathematically invisible to standard non-conformity measures owing to information loss in dimensionality reduction.
