
Poster
in
Workshop: 2nd Workshop on Test-Time Adaptation: Putting Updates to the Test (PUT)

An Evidence-Based Post-Hoc Adjustment Framework for Anomaly Detection Under Data Contamination

Sukanya Patra · Souhaib Ben Taieb

Fri 18 Jul 2:30 p.m. PDT — 3:15 p.m. PDT

Abstract:

Unsupervised anomaly detection (AD) methods typically assume clean training data, yet real-world datasets often contain undetected or mislabeled anomalies, leading to significant performance degradation. Existing solutions require access to the training pipeline, the training data, or prior knowledge of the proportion of anomalies in the data, which limits their real-world applicability. To address this challenge, we propose EPHAD, a simple yet effective inference-time adaptation framework that updates the outputs of AD models trained on contaminated datasets using evidence gathered at inference. Our approach formulates test-time adaptation as a Bayesian inference problem, integrating the prior knowledge captured by the AD model trained on the contaminated dataset with auxiliary evidence derived from foundation models such as CLIP, classical methods such as the Latent Outlier Factor, or domain-specific knowledge. We validate its effectiveness through extensive experiments across eight image-based AD datasets, twenty-seven tabular datasets, and a real-world industrial dataset. Our code is publicly available: https://anonymous.4open.science/r/EPAF-2025/.
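The Bayesian formulation described above can be illustrated with a minimal sketch: treat the contaminated AD model's score as a prior and an auxiliary evidence score as a likelihood, and combine them multiplicatively (additively in log space). This is an illustrative assumption about the combination rule, not the paper's actual EPHAD implementation; the function name and inputs are hypothetical.

```python
import math

def posterior_anomaly_score(prior_score, evidence_score, eps=1e-12):
    """Hypothetical sketch of an evidence-based post-hoc adjustment.

    prior_score:    anomaly score in (0, 1] from an AD model trained on
                    a (possibly contaminated) dataset, read as a prior.
    evidence_score: auxiliary score in (0, 1] gathered at inference,
                    e.g. a CLIP similarity or a Latent Outlier Factor
                    value rescaled to (0, 1], read as a likelihood.

    Returns an unnormalized log-posterior: log prior + log likelihood.
    Working in log space avoids numerical underflow for small scores.
    """
    return math.log(prior_score + eps) + math.log(evidence_score + eps)

# Example: a sample the prior and the evidence both flag as anomalous
# ends up ranked above one that both consider normal.
suspicious = posterior_anomaly_score(0.9, 0.8)
benign = posterior_anomaly_score(0.2, 0.1)
```

Ranking test samples by this combined score lets the evidence override a prior that was biased by contamination, without retraining or touching the original pipeline.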
