

Poster in Workshop: 2nd Workshop on Test-Time Adaptation: Putting Updates to the Test (PUT)

Scalable Temporal Domain Generalization via Prompting

Sepidehsadat Hosseini · Mengyao Zhai · Hossein Hajimirsadeghi · Frederick Tung

[ Project Page ]
Fri 18 Jul 11:15 a.m. PDT — noon PDT

Abstract:

Machine learning models typically assume that training and testing data are independent and identically distributed (i.i.d.). In real-world deployment, however, data often evolves over time. Addressing this challenge requires models that can adapt efficiently at test time without retraining. This paper introduces a prompting-based test-time adaptation framework for temporal domain generalization that enables pre-trained models to adapt efficiently to evolving distributions. Our method is both parameter- and time-efficient, leveraging global prompts, domain-specific prompts, and drift-aware prompts to model and forecast temporal shifts in data distributions. By extrapolating these learned adaptations, our approach keeps pre-trained models adaptive in dynamic environments. We demonstrate the adaptability, scalability, and generality of our framework across classification, regression, time-series forecasting, and NLP tasks, highlighting its effectiveness in adapting foundation models to real-world temporal shifts.
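
To make the high-level idea concrete, below is a minimal, hypothetical PyTorch sketch of how global, domain-specific, and drift-aware prompts might be composed and extrapolated to an unseen future time step before being prepended to a frozen backbone's token embeddings. The class name, parameters, and the linear-extrapolation drift model are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: names and the drift model are assumptions, not the authors' method.
import torch
import torch.nn as nn


class TemporalPromptPool(nn.Module):
    """Composes a global prompt, a domain-specific prompt, and a drift-aware
    prompt forecast for unseen future time steps."""

    def __init__(self, num_domains: int, prompt_len: int, embed_dim: int):
        super().__init__()
        # One prompt shared across all time steps.
        self.global_prompt = nn.Parameter(torch.randn(prompt_len, embed_dim) * 0.02)
        # One learned prompt per observed training domain (time step).
        self.domain_prompts = nn.Parameter(
            torch.randn(num_domains, prompt_len, embed_dim) * 0.02
        )
        self.num_domains = num_domains

    def drift_prompt(self, t: int) -> torch.Tensor:
        """Return the prompt for time step t. Beyond the training range,
        linearly extrapolate from the last two domain prompts (an assumption;
        the actual drift forecasting could be a learned model)."""
        if t < self.num_domains:
            return self.domain_prompts[t]
        last, prev = self.domain_prompts[-1], self.domain_prompts[-2]
        steps = t - (self.num_domains - 1)
        return last + steps * (last - prev)

    def forward(self, token_embeds: torch.Tensor, t: int) -> torch.Tensor:
        """Prepend the composed prompt to token embeddings of shape
        (batch, seq_len, embed_dim) before they enter the frozen backbone."""
        prompt = self.global_prompt + self.drift_prompt(t)          # (prompt_len, embed_dim)
        prompt = prompt.unsqueeze(0).expand(token_embeds.size(0), -1, -1)
        return torch.cat([prompt, token_embeds], dim=1)


# Usage: adapt to a future time step (t=12) beyond the 10 observed domains,
# without updating the backbone's weights.
pool = TemporalPromptPool(num_domains=10, prompt_len=8, embed_dim=64)
x = torch.randn(4, 32, 64)      # dummy token embeddings
augmented = pool(x, t=12)       # (4, 8 + 32, 64), fed to the frozen model
```

Only the small prompt pool is trained, which is one way such a design can stay parameter- and time-efficient relative to retraining or fine-tuning the backbone.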
