Poster in Workshop: Methods and Opportunities at Small Scale (MOSS)
Pruning Increases Orderedness in Weight-Tied Recurrent Computation
Yiding Song
Keywords: [perceptron] [directionality] [hierarchical organisation] [pruning]
Inspired by the prevalence of recurrent circuits in biological brains, we investigate the degree to which directionality is a helpful inductive bias for artificial neural networks. Defining directionality as topologically-ordered information flow between neurons, we formalise a perceptron layer with all-to-all connections (mathematically equivalent to a weight-tied recurrent neural network) and demonstrate that directionality, a hallmark of modern feed-forward networks, can be \emph{induced} rather than hard-wired by applying appropriate pruning techniques. Across different random seeds, our pruning schemes successfully induce greater topological ordering in information flow between neurons without compromising performance, suggesting that directionality is \emph{not} a prerequisite for learning, but may be an advantageous inductive bias discoverable by gradient descent and sparsification.
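To make the setup concrete, the sketch below is an assumed illustration, not the paper's code: an all-to-all layer applied as a weight-tied recurrence, plain magnitude pruning standing in for the paper's pruning schemes, and a hypothetical `orderedness` score measuring the fraction of surviving weight mass that respects a fixed neuron ordering (a strictly feed-forward matrix under that ordering scores 1, a direction-free one roughly 0.5). The step count, pruning fraction, and metric are all illustrative assumptions; in particular, a faithful orderedness measure would presumably optimise over neuron orderings, which this sketch omits.

```python
# Minimal sketch (assumptions, not the authors' implementation) of an
# all-to-all perceptron layer run as a weight-tied recurrent network,
# with magnitude pruning and an illustrative orderedness score.
import torch

torch.manual_seed(0)
n = 16                       # neurons in the all-to-all layer (assumed size)
W = torch.randn(n, n) * 0.1  # one weight matrix, reused (tied) at every step


def forward(x, W, n_steps=4):
    # Weight-tied recurrence: every neuron may feed every other neuron,
    # and the same W is applied at each of n_steps iterations.
    h = x
    for _ in range(n_steps):
        h = torch.tanh(h @ W.T)
    return h


def magnitude_prune(W, frac=0.5):
    # Zero out the smallest-magnitude fraction of connections
    # (a generic stand-in for the paper's pruning schemes).
    k = max(1, int(frac * W.numel()))
    thresh = W.abs().flatten().kthvalue(k).values
    return W * (W.abs() > thresh)


def orderedness(W):
    # Illustrative proxy for topologically-ordered information flow:
    # the fraction of surviving weight mass lying strictly below the
    # diagonal, i.e. flowing from earlier to later neurons under one
    # fixed ordering. The paper's metric is presumably permutation-aware.
    mass = W.abs().sum()
    lower = torch.tril(W, diagonal=-1).abs().sum()
    return (lower / mass).item()


x = torch.randn(1, n)
print("orderedness before pruning:", orderedness(W))
W_pruned = magnitude_prune(W, frac=0.5)
print("orderedness after pruning: ", orderedness(W_pruned))
print("output shape:", forward(x, W_pruned).shape)
```

Note the design point the equivalence rests on: with all-to-all connections there is no layer ordering to exploit, so the only way to run the layer is to iterate it, which is exactly a weight-tied RNN; directionality then shows up as (approximate) triangularity of W under some neuron permutation rather than as a structural constraint.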