

Oral in Workshop: 2nd Workshop on Test-Time Adaptation: Putting Updates to the Test (PUT)

Oral Talk 5: Leto: Modeling Multivariate Time Series with Memorizing at Test Time

Daniel Cao

Fri 18 Jul 4 p.m. PDT — 4:15 p.m. PDT

Abstract:

Modeling multivariate time series remains a core challenge due to complex temporal and cross-variate dependencies. While sequence models like Transformers, CNNs, and RNNs have been adapted from NLP and vision tasks, they often struggle with multivariate structure, long-range dependencies, or error propagation. We introduce Leto, a 2D memory module that leverages temporal inductive bias while preserving variate permutation equivariance. By combining in-context memory with cross-variate attention, Leto effectively captures temporal patterns and inter-variate signals. Experiments across diverse benchmarks—forecasting, classification, and anomaly detection—demonstrate its strong performance.
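The paper's exact module is not specified here, but the property the abstract emphasizes, attention across variates that preserves variate permutation equivariance, can be illustrated with a minimal sketch. The function and weight names below (`cross_variate_attention`, `Wq`, `Wk`, `Wv`) are illustrative assumptions, not Leto's actual implementation: it applies shared-weight self-attention over the variate axis at each time step, so permuting the input variates permutes the output identically.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_variate_attention(X, Wq, Wk, Wv):
    """Toy self-attention across the variate axis (illustrative, not Leto).

    X: array of shape (T, V, d) -- T time steps, V variates, d channels.
    Because the projection weights are shared across variates and
    attention mixes only the variate axis, the map is equivariant to
    any permutation of the V variates.
    """
    Q, K, V_ = X @ Wq, X @ Wk, X @ Wv                          # each (T, V, d)
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(X.shape[-1])   # (T, V, V)
    return softmax(scores, axis=-1) @ V_                       # (T, V, d)
```

Reordering the variates of the input and then applying the function gives the same result as applying the function first and reordering the output, which is the permutation-equivariance property the abstract claims Leto preserves.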
