

Poster

LightGTS: A Lightweight General Time Series Forecasting Model

Yihang Wang · Yuying Qiu · Peng Chen · Yang Shu · Zhongwen Rao · Lujia Pan · Bin Yang · Chenjuan Guo

East Exhibition Hall A-B #E-2111
Wed 16 Jul 11 a.m. PDT — 1:30 p.m. PDT

Abstract:

Existing works on general time series forecasting build foundation models with heavy parameter counts through large-scale multi-source pretraining. These models achieve superior generalization across diverse datasets, but at the cost of a significant computational burden that limits their use in resource-constrained scenarios. This paper introduces LightGTS, a lightweight general time series forecasting model designed from the perspective of consistent periodical modeling. To handle the diverse scales and intrinsic periods encountered in multi-source pre-training, we introduce Periodical Tokenization, which extracts consistent periodic patterns across datasets with varying scales. To better utilize periodicity in the decoding process, we further introduce Periodical Parallel Decoding, which leverages historical tokens to improve forecasting. Building on these two techniques, which fully exploit the inductive bias of periodicity inherent in time series, LightGTS achieves outstanding performance on general time series forecasting with a lightweight model. It achieves state-of-the-art forecasting performance on 9 real-world benchmarks in both zero-shot and full-shot settings, with much better efficiency than existing time series foundation models.
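To make the tokenization idea concrete, here is a minimal sketch (not the paper's implementation): one common way to obtain period-aligned tokens is to estimate a series' dominant period from the FFT amplitude spectrum and segment the series into non-overlapping patches of that length, so that one token covers one full cycle regardless of the dataset's sampling scale. The function names and the FFT-based period detection below are illustrative assumptions, not details taken from LightGTS.

```python
import numpy as np

def estimate_period(x: np.ndarray, max_period: int = 512) -> int:
    """Estimate the dominant period of a 1-D series from the FFT amplitude spectrum.
    (Hypothetical helper for illustration; not from the paper.)"""
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    spectrum[0] = 0.0                      # ignore the DC component
    freqs = np.fft.rfftfreq(len(x))
    k = int(np.argmax(spectrum))
    if freqs[k] == 0:                      # degenerate case, e.g. a constant series
        return 1
    return int(np.clip(round(1.0 / freqs[k]), 1, max_period))

def periodical_tokenize(x: np.ndarray) -> np.ndarray:
    """Segment a series into non-overlapping patches of one period each, so tokens
    from datasets with different sampling scales align on phase."""
    p = estimate_period(x)
    n = (len(x) // p) * p                  # drop the ragged tail
    return x[:n].reshape(-1, p)            # shape: (num_tokens, period)

# Toy example: a noisy 24-point cycle repeated for 30 "days"
t = np.arange(24 * 30)
x = np.sin(2 * np.pi * t / 24) + 0.1 * np.random.randn(t.size)
print(periodical_tokenize(x).shape)        # (30, 24)
```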

Lay Summary:

Existing approaches to general time series forecasting often rely on large-scale data and complex multi-source pre-training, resulting in models with massive parameter counts and high computational costs, making them impractical in resource-constrained settings. To address this, we introduce LightGTS, a lightweight general time series forecasting model designed around consistent periodical modeling. To handle diverse scales and intrinsic periodicities in multi-source pre-training, we propose Periodical Tokenization, which extracts consistent periodic patterns across datasets of varying scales. To improve prediction during decoding, we further develop Periodical Parallel Decoding, which leverages historical information to enhance forecasting accuracy. By fully harnessing the inherent periodic inductive bias of time series through these two techniques, LightGTS achieves exceptional performance with a lightweight architecture. It delivers state-of-the-art forecasting results on 9 real-world benchmarks under both zero-shot and full-shot settings, while significantly outperforming existing time series foundation models in efficiency.
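For intuition on the decoding side, the sketch below shows the parallel structure only: all forecast tokens are drafted in a single pass, with no step-by-step autoregressive loop, and each draft is seeded from recent historical period-tokens. The seeding rule (mean of the last few historical tokens) and the function name are illustrative assumptions; in LightGTS a learned decoder would refine such drafts jointly.

```python
import numpy as np

def periodical_parallel_decode(history_tokens: np.ndarray,
                               n_future: int,
                               context: int = 3) -> np.ndarray:
    """Draft every forecast token at once (non-autoregressive): each future period
    is initialized from the mean of the last `context` historical period-tokens.
    (Illustrative sketch, not the paper's decoder.)"""
    draft = history_tokens[-context:].mean(axis=0)   # (period,)
    return np.tile(draft, (n_future, 1))             # (n_future, period)

# Continuing the tokenization sketch: forecast one week of daily cycles at once.
# tokens = periodical_tokenize(x)                   # (30, 24)
# forecast = periodical_parallel_decode(tokens, n_future=7).reshape(-1)
```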
