

Poster in Affinity Workshop: New In ML

Bio-Inspired Neural Architecture Adaptation: Unified Dynamic Growth, Gating, and Pruning for Language Modeling and Image Classification

Yun Juei Yen


Abstract:

We present a unified approach to neural architecture adaptation that dynamically grows, prunes, and gates model connections during training, inspired by biological neural plasticity. Our method integrates self-guided capacity expansion through gating signals with sparse connectivity and learnable residual scaling, enabling networks to adjust their architecture on the fly for both language modeling and image classification. In experiments on character-level financial news language modeling and CIFAR-100 image classification, our adaptive models outperform static architectures (e.g., a validation loss of 1.28 vs. 1.35 for a fixed Transformer on financial text, and 70.35% vs. 66.00% test accuracy on CIFAR-100) while maintaining high efficiency. We detail the gating mechanisms that modulate information flow, the criteria and policies for neuron growth and connection pruning, and a residual scaling technique that preserves inter-layer representation diversity. Results show that our bio-inspired adaptive strategy not only improves accuracy and generalization but also reveals instructive training dynamics, such as targeted capacity allocation and sustained feature diversity.
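
The abstract names the mechanisms (gating signals, neuron growth, connection pruning, learnable residual scaling) but does not specify how they are implemented. The following is a minimal PyTorch sketch of one way these pieces could fit together; all names (GatedResidualBlock, prune_low_gates, grow_hidden) and the threshold and initialization choices are illustrative assumptions, not the paper's actual method.

```python
import torch
import torch.nn as nn

class GatedResidualBlock(nn.Module):
    """Feed-forward block with a per-feature gate on the transformed path
    and a learnable scale on the residual path (illustrative sketch)."""
    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.ff = nn.Sequential(
            nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim)
        )
        self.gate = nn.Parameter(torch.zeros(dim))    # sigmoid(0) = 0.5 at init
        self.res_scale = nn.Parameter(torch.ones(1))  # learnable residual scaling

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        g = torch.sigmoid(self.gate)  # gating signal in (0, 1) modulating flow
        return self.res_scale * x + g * self.ff(x)

def prune_low_gates(block: GatedResidualBlock, threshold: float = 0.05) -> None:
    """Suppress features whose gates stay below a threshold: a simple
    magnitude-style pruning criterion (assumed, not from the paper)."""
    with torch.no_grad():
        keep = torch.sigmoid(block.gate) > threshold
        block.gate.masked_fill_(~keep, -10.0)  # drives sigmoid(gate) to ~0

def grow_hidden(block: GatedResidualBlock, extra: int) -> None:
    """Widen the hidden layer by `extra` neurons while preserving the
    block's current function (one plausible growth policy)."""
    old1, old2 = block.ff[0], block.ff[2]
    new1 = nn.Linear(old1.in_features, old1.out_features + extra)
    new2 = nn.Linear(old2.in_features + extra, old2.out_features)
    with torch.no_grad():
        new1.weight[: old1.out_features] = old1.weight
        new1.bias[: old1.out_features] = old1.bias
        new1.weight[old1.out_features:].mul_(1e-3)  # new rows start near zero
        new1.bias[old1.out_features:].zero_()
        new2.weight[:, : old2.in_features] = old2.weight
        new2.weight[:, old2.in_features:].zero_()   # new columns exactly zero
        new2.bias.copy_(old2.bias)
    block.ff[0], block.ff[2] = new1, new2
```

Zero-initializing the new output columns in grow_hidden keeps the block's function unchanged at the moment of growth, so the added capacity is only recruited if gradients make use of it: one common way to reconcile on-the-fly expansion with training stability.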
