

Poster

Fast and Low-Cost Genomic Foundation Models via Outlier Removal

Haozheng Luo · Chenghao Qiu · Maojiang Su · Zhihan Zhou · Zoe Mehta · Guo Ye · Jerry Yao-Chieh Hu · Han Liu

West Exhibition Hall B2-B3 #W-506
[ Project Page ]
Tue 15 Jul 4:30 p.m. PDT — 7 p.m. PDT

Abstract:

To address the challenge of scarce computational resources in genomic modeling, we introduce GERM, a genomic foundation model optimized for accessibility and adaptability. GERM improves upon models like DNABERT-2 by eliminating outliers that hinder low-rank adaptation and post-training quantization, enhancing both efficiency and robustness. We replace the vanilla attention layer with an outlier-free mechanism inspired by associative memory models. By removing outliers during both pre-training and fine-tuning, this approach accelerates adaptation, reduces computational costs, and enhances quantization robustness within acceptable loss margins. Additionally, we propose GERM-T, a strategy that employs small-step continual learning within the outlier-free framework, leveraging original checkpoints to avoid retraining from scratch. Empirically, GERM improves fine-tuning performance by 37.98% and quantization by 64.34% over the baseline model. It also reduces average kurtosis by 92.14% and maximum infinity norm by 82.77%. Compared to leading methods, GERM consistently delivers superior performance, offering a practical solution for genomic modeling in resource-constrained settings.
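The abstract does not spell out the exact outlier-free attention mechanism. As a rough illustration only, the sketch below shows one known outlier-reducing formulation from the associative-memory literature: a softmax whose denominator includes an extra "+1" term (sometimes written softmax_1), which lets a head attend to "nothing" instead of producing large activation outliers. The module name, the single-head setup, and the choice of this particular variant are assumptions for illustration, not the GERM layer itself.

```python
import torch
import torch.nn as nn


def softmax_1(scores: torch.Tensor, dim: int = -1) -> torch.Tensor:
    """Softmax with an implicit extra logit fixed at 0 (adds 1 to the denominator)."""
    # Subtract a non-negative max for numerical stability; the implicit zero
    # logit is shifted by the same amount, giving the exp(-max) term below.
    max_score = scores.amax(dim=dim, keepdim=True).clamp(min=0.0)
    exp_scores = torch.exp(scores - max_score)
    return exp_scores / (exp_scores.sum(dim=dim, keepdim=True) + torch.exp(-max_score))


class OutlierFreeSelfAttention(nn.Module):
    """Single-head self-attention using softmax_1 instead of vanilla softmax (illustrative)."""

    def __init__(self, dim: int):
        super().__init__()
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        scores = (q @ k.transpose(-2, -1)) * self.scale
        attn = softmax_1(scores, dim=-1)  # rows may sum to less than 1, damping outliers
        return self.out(attn @ v)
```

Because the attention weights no longer have to sum to one, a head that finds nothing relevant can output near-zero weights rather than concentrating mass on a few tokens, which is one way large activation outliers can be avoided.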

Lay Summary:

Large AI models for genomic analysis often require significant computing resources, making them difficult to use in resource-constrained environments. To address this, we introduce GERM, a genomic foundation model optimized for accessibility and adaptability. GERM improves upon models like DNABERT-2 by removing outliers that impede low-rank adaptation and post-training quantization, thereby improving efficiency and robustness. It replaces the standard attention layer with an outlier-free mechanism inspired by associative memory models, enabling faster adaptation, lower computational cost, and improved quantization robustness with minimal performance loss. We also propose GERM-T, a continual learning strategy that supports small-step updates using existing checkpoints, avoiding the need for full retraining. Our paper shows that GERM improves fine-tuning performance by 37.98% and quantization by 64.34% over the baseline. Moreover, it effectively suppresses outlier indicators, achieving a 92.14% reduction in average kurtosis and an 82.77% decrease in the maximum infinity norm. These results make GERM and GERM-T practical tools for genomic modeling in resource-constrained settings.
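The two outlier indicators quoted above, kurtosis and infinity norm, are standard statistics of a model's hidden activations. A minimal sketch of how they might be measured is given below; the helper names, the use of forward hooks, and the assumption that the model exposes an iterable `model.layers` are illustrative, not part of the paper's evaluation code.

```python
import torch


def outlier_stats(activations: list[torch.Tensor]) -> dict[str, float]:
    """Average kurtosis and maximum infinity norm over a list of activation tensors."""
    kurtoses, inf_norms = [], []
    for act in activations:
        flat = act.detach().float().flatten()
        mean, std = flat.mean(), flat.std()
        # Kurtosis: fourth standardized moment (3.0 for a Gaussian); larger
        # values indicate heavier tails, i.e. more extreme activation outliers.
        kurtoses.append(((flat - mean) / std).pow(4).mean().item())
        # Infinity norm: the single largest absolute activation value.
        inf_norms.append(flat.abs().max().item())
    return {
        "avg_kurtosis": sum(kurtoses) / len(kurtoses),
        "max_inf_norm": max(inf_norms),
    }


def collect_and_summarize(model, batch):
    """Capture each layer's output with forward hooks, then summarize (assumed interface)."""
    captured = []
    hooks = [
        layer.register_forward_hook(
            lambda m, i, o: captured.append(o[0] if isinstance(o, tuple) else o)
        )
        for layer in model.layers
    ]
    with torch.no_grad():
        model(batch)
    for h in hooks:
        h.remove()
    return outlier_stats(captured)
```

Lower values of both statistics after replacing the attention layer would indicate that activations are better behaved, which is what makes low-rank adaptation and post-training quantization less lossy.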
