Poster in Affinity Workshop: New In ML
Flatness-Aware Random Basis Combination for Exemplar-free Continual Learning
Chenrui Xiong · Lin Zhang
Continual Learning (CL) aims to train models on sequential tasks without forgetting prior knowledge. Traditional methods, however, often suffer from catastrophic forgetting, where performance on earlier tasks degrades as the model learns new ones. Recent approaches leverage pre-trained models and parameter-efficient tuning to alleviate this, but challenges remain in reducing task interference and controlling parameter growth. In this paper, we propose Flatness-aware Random Basis Combination (FRBC), an exemplar-free continual learning method that combines flatness-aware optimization with random basis parameterization. FRBC restricts updates to a low-dimensional subspace spanned by fixed random bases, optimizing only the combination coefficients. This decouples the number of trainable parameters from the model size, enabling efficient scaling to many tasks. By promoting flat solutions and constraining updates within a shared subspace, FRBC effectively balances stability and plasticity. Extensive experiments on standard CL benchmarks demonstrate that FRBC consistently outperforms existing state-of-the-art methods, without relying on any stored data from previous tasks.
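A minimal sketch of the core idea in PyTorch, under our own assumptions rather than the poster's exact formulation: the weight update is parameterized as a combination of fixed random bases with only the coefficients trained, and the coefficients are updated with a SAM-style two-pass step as a stand-in for the flatness-aware optimization. Names and hyperparameters such as num_bases, rho, and lr are illustrative.

import torch
import torch.nn as nn

class RandomBasisDelta(nn.Module):
    """Weight update delta_W = sum_k c_k * B_k with frozen random bases B_k.

    Only the combination coefficients c_k are trainable, so the number of
    trainable parameters is num_bases, independent of the layer size.
    """

    def __init__(self, out_dim: int, in_dim: int, num_bases: int = 16, seed: int = 0):
        super().__init__()
        g = torch.Generator().manual_seed(seed)
        # Fixed random bases (num_bases, out_dim, in_dim); never updated.
        bases = torch.randn(num_bases, out_dim, in_dim, generator=g) / (in_dim ** 0.5)
        self.register_buffer("bases", bases)
        # Trainable combination coefficients, one scalar per basis.
        self.coeffs = nn.Parameter(torch.zeros(num_bases))

    def forward(self) -> torch.Tensor:
        # The update lives in the subspace spanned by the fixed bases.
        return torch.einsum("k,koi->oi", self.coeffs, self.bases)


def flatness_aware_step(delta: RandomBasisDelta, loss_fn, rho: float = 0.05, lr: float = 1e-2):
    """One SAM-like update of the coefficients (illustrative, not the paper's exact rule)."""
    # First pass: gradient at the current coefficients.
    loss = loss_fn(delta())
    grad = torch.autograd.grad(loss, delta.coeffs)[0]
    # Perturb the coefficients toward the locally worst-case direction of radius rho.
    eps = rho * grad / (grad.norm() + 1e-12)
    with torch.no_grad():
        delta.coeffs.add_(eps)
    # Second pass: gradient at the perturbed point drives the actual update.
    loss_perturbed = loss_fn(delta())
    grad_sharp = torch.autograd.grad(loss_perturbed, delta.coeffs)[0]
    with torch.no_grad():
        delta.coeffs.sub_(eps)               # undo the perturbation
        delta.coeffs.sub_(lr * grad_sharp)   # descend with the sharpness-aware gradient
    return loss.item()

In use, loss_fn would evaluate the task loss with the base (pre-trained) weights plus delta(); because the bases are shared and frozen, updates for every task stay in the same low-dimensional subspace while only a small coefficient vector is optimized.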