

Poster in Workshop: 2nd Workshop on Test-Time Adaptation: Putting Updates to the Test (PUT)

Leto: Modeling Multivariate Time Series with Memorizing at Test Time

Ali Behrouz · Daniel Cao · Ali Parviz · Michele Santacatterina · Ramin Zabih

Fri 18 Jul 11:15 a.m. PDT — noon PDT

Abstract:

Modeling multivariate time series remains a core challenge due to complex temporal and cross-variate dependencies. While sequence models like Transformers, CNNs, and RNNs have been adapted from NLP and vision tasks, they often struggle with multivariate structure, long-range dependencies, or error propagation. We introduce Leto, a 2D memory module that leverages temporal inductive bias while preserving variate permutation equivariance. By combining in-context memory with cross-variate attention, Leto effectively captures temporal patterns and inter-variate signals. Experiments across diverse benchmarks—forecasting, classification, and anomaly detection—demonstrate its strong performance.
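To make the permutation-equivariance property concrete, here is a minimal sketch of a cross-variate attention layer that is equivariant in the variate dimension: because the projection weights are shared across variates and attention scores are computed symmetrically between them, permuting the input variates permutes the outputs identically. All names and shapes here are illustrative assumptions, not Leto's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_variate_attention(x, wq, wk, wv):
    """x: (variates V, time T, dim d). Attention across the variate axis
    at each time step; shared projections make the layer permutation-
    equivariant over variates. Hypothetical sketch, not Leto's code."""
    q, k, v = x @ wq, x @ wk, x @ wv                      # each (V, T, d)
    # per-time-step scores between variates: (T, V, V)
    scores = np.einsum('vtd,wtd->tvw', q, k) / np.sqrt(x.shape[-1])
    att = softmax(scores, axis=-1)
    return np.einsum('tvw,wtd->vtd', att, v)              # (V, T, d)

rng = np.random.default_rng(0)
V, T, d = 4, 6, 8
x = rng.normal(size=(V, T, d))
wq, wk, wv = (rng.normal(size=(d, d)) for _ in range(3))

out = cross_variate_attention(x, wq, wk, wv)
perm = rng.permutation(V)
out_perm = cross_variate_attention(x[perm], wq, wk, wv)
# permuting the input variates permutes the output the same way
equivariant = np.allclose(out[perm], out_perm)
```

A layer like this never depends on the order in which variates are stacked, which is the equivariance the abstract refers to; the temporal inductive bias would come from the separate memory component operating along the time axis.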
