Poster in Workshop: 2nd Workshop on Test-Time Adaptation: Putting Updates to the Test (PUT)
An Evidence-Based Post-Hoc Adjustment Framework for Anomaly Detection Under Data Contamination
Sukanya Patra · Souhaib Ben Taieb
Unsupervised anomaly detection (AD) methods typically assume clean training data, yet real-world datasets often contain undetected or mislabeled anomalies, leading to significant performance degradation. Existing solutions require access to the training pipeline, the training data, or prior knowledge of the proportion of anomalies in the data, limiting their real-world applicability. To address this challenge, we propose EPHAD, a simple yet effective inference-time adaptation framework that updates the outputs of AD models trained on contaminated datasets using evidence gathered at inference time. Our approach formulates test-time adaptation as a Bayesian inference problem, integrating the prior knowledge captured by an AD model trained on a contaminated dataset with auxiliary evidence derived from foundation models such as CLIP, classical methods such as the Latent Outlier Factor, or domain-specific knowledge. We validate the effectiveness of EPHAD through extensive experiments across eight image-based AD datasets, twenty-seven tabular datasets, and a real-world industrial dataset. Our code is publicly available: https://anonymous.4open.science/r/EPAF-2025/.
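The abstract describes combining a contaminated AD model's output (prior) with auxiliary evidence in a Bayesian fashion. The sketch below is a hypothetical illustration of that general idea, not the authors' actual method: both the score-to-probability mapping (a sigmoid) and the tempering weight are assumptions introduced here for concreteness.

```python
import numpy as np

def posterior_anomaly_score(prior_scores, evidence_scores, weight=1.0):
    """Illustrative Bayesian-style post-hoc adjustment (not EPHAD itself).

    prior_scores:    raw anomaly scores from the (possibly contaminated) AD model
    evidence_scores: raw scores from an auxiliary source (e.g., CLIP, LOF)
    weight:          tempering factor controlling the evidence's influence
    """
    # Map raw scores to (0, 1) pseudo-probabilities of being anomalous
    # via a sigmoid -- an assumed calibration, chosen for simplicity.
    p_prior = 1.0 / (1.0 + np.exp(-np.asarray(prior_scores, dtype=float)))
    p_evid = 1.0 / (1.0 + np.exp(-np.asarray(evidence_scores, dtype=float)))
    # Unnormalised posterior: prior * evidence^weight, computed in log space
    # for numerical stability.
    log_anom = np.log(p_prior) + weight * np.log(p_evid)
    log_norm = np.log(1.0 - p_prior) + weight * np.log(1.0 - p_evid)
    # Renormalise to a posterior probability of being anomalous.
    return 1.0 / (1.0 + np.exp(log_norm - log_anom))
```

With weight=0 the evidence is ignored and the prior is returned unchanged; agreeing evidence sharpens the score, while conflicting evidence pulls it back toward 0.5, mirroring the intended corrective effect on models trained with contamination.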