Poster
Permutation Equivariant Neural Networks for Symmetric Tensors
Edward Pearce-Crump
West Exhibition Hall B2-B3 #W-805
Many scientific fields use data in the form of symmetric tensors, but current machine learning models do not fully take advantage of their natural symmetries. This limits how well models can learn from such data.

We developed a new way to design models that respect these symmetries by fully characterising all linear functions between symmetric tensors that are equivariant under permutations. To make this characterisation practical, we introduced a method that represents these functions without needing large amounts of memory to store weight matrices, making it adaptable to symmetric tensors of different sizes.

When tested on two example problems, our method learned from less data and generalised better than standard neural networks on data that can be represented as symmetric tensors. This opens up new possibilities for applying machine learning to scientific domains where symmetric tensors play a central role, such as physics, chemistry, and materials science.
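To make the central property concrete, here is a minimal sketch (not the paper's construction) of what permutation equivariance means for a linear map on symmetric matrices (order-2 symmetric tensors). The map below combines the input with its trace and total sum, all of which interact cleanly with simultaneous row and column permutations; the coefficients `a`, `b`, `c` are illustrative weights.

```python
import numpy as np

def equivariant_map(X, a=0.7, b=0.2, c=0.1):
    # A simple permutation-equivariant linear map on symmetric n x n matrices:
    # scaled input + trace term on the diagonal + total-sum term on all entries.
    n = X.shape[0]
    return a * X + b * np.trace(X) * np.eye(n) + c * X.sum() * np.ones((n, n))

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
X = A + A.T                      # symmetric input tensor

perm = rng.permutation(n)
P = np.eye(n)[perm]              # permutation matrix

lhs = equivariant_map(P @ X @ P.T)      # permute first, then apply the map
rhs = P @ equivariant_map(X) @ P.T      # apply the map, then permute
print(np.allclose(lhs, rhs))            # equivariance: both orders agree
```

Note that none of the weights depend on the matrix size `n`, which hints at how such parameterisations can transfer across symmetric tensors of different sizes without storing large dense weight matrices.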