Poster
Chaos Meets Attention: Transformers for Large-Scale Dynamical Prediction
Yi He · Yiming Yang · Xiaoyuan Cheng · Hai Wang · Xiao Xue · Boli Chen · Yukun Hu
West Exhibition Hall B2-B3 #W-114
Chaotic systems are widely known for the "butterfly effect," where tiny changes in initial conditions can lead to vastly different outcomes. Predicting their long-term behavior is extremely hard, especially when the system has many moving parts (high dimensions) and fine details (high resolution). Yet while chaos seems unpredictable, these systems often follow hidden patterns over time.

Our research tackles the challenge of simulating such systems by combining physics insights with multi-stage acceleration for scaling. We designed a new transformer-based model, similar to those powering language and vision models, in which we redesign the attention mechanism and training methods, creating a faster, more robust way to generate long-term chaotic dynamics.

We also introduce new benchmarks designed specifically for early-stage machine learning research on chaotic systems. Our goal is to make large-scale chaos a little more predictable and to accelerate the related machine learning research and its applications.
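To give a concrete sense of the general setup, here is a minimal sketch, not the authors' implementation, of an autoregressive transformer that predicts the next state of a low-dimensional chaotic system and rolls out a long trajectory step by step. It uses standard PyTorch causal self-attention; all module names, hyperparameters, and the rollout helper are illustrative assumptions, and the paper's modified attention and training scheme are not reproduced here.

```python
# Illustrative sketch only (assumed names/hyperparameters), using vanilla
# PyTorch attention rather than the paper's modified mechanism.
import torch
import torch.nn as nn

class DynamicsTransformer(nn.Module):
    def __init__(self, state_dim=3, d_model=64, n_heads=4, n_layers=2, context=32):
        super().__init__()
        self.context = context
        self.embed = nn.Linear(state_dim, d_model)               # lift state into model space
        self.pos = nn.Parameter(torch.zeros(context, d_model))   # learned positional encoding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, state_dim)                # map back to state space

    def forward(self, x):                                        # x: (batch, time, state_dim)
        h = self.embed(x) + self.pos[: x.size(1)]
        mask = nn.Transformer.generate_square_subsequent_mask(x.size(1))
        h = self.encoder(h, mask=mask)                           # causal self-attention
        return self.head(h[:, -1])                               # predict the next state

@torch.no_grad()
def rollout(model, history, steps):
    """Autoregressively generate `steps` future states from a seed trajectory."""
    states = [history]
    for _ in range(steps):
        window = torch.cat(states, dim=1)[:, -model.context:]    # keep a sliding context
        states.append(model(window).unsqueeze(1))
    return torch.cat(states[1:], dim=1)

model = DynamicsTransformer()
seed = torch.randn(1, 32, 3)           # e.g., 32 observed steps of a 3-D (Lorenz-like) system
future = rollout(model, seed, steps=100)
print(future.shape)                    # torch.Size([1, 100, 3])
```

The sliding-window rollout is what makes long-horizon generation possible at fixed cost per step, and it is exactly where error accumulation, the core difficulty with chaotic dynamics, enters; the attention and training innovations described above target the stability of this loop.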