

Poster

Improving the Effective Receptive Field of Message-Passing Neural Networks

Shahaf E. Finder · Ron Shapira Weber · Moshe Eliasof · Oren Freifeld · Eran Treister

East Exhibition Hall A-B #E-3100
Wed 16 Jul 11 a.m. PDT — 1:30 p.m. PDT

Abstract:

Message-Passing Neural Networks (MPNNs) have become a cornerstone for processing and analyzing graph-structured data. However, their effectiveness is often hindered by phenomena such as over-squashing, where long-range dependencies or interactions are inadequately captured and expressed in the MPNN output. This limitation mirrors the challenge of the Effective Receptive Field (ERF) in Convolutional Neural Networks (CNNs), where the theoretical receptive field is underutilized in practice. In this work, we demonstrate and theoretically explain the limited-ERF problem in MPNNs. Furthermore, inspired by recent advances in ERF augmentation for CNNs, we propose the Interleaved Multiscale Message-Passing Neural Network (IM-MPNN) architecture to address these problems in MPNNs. Our method incorporates a hierarchical coarsening of the graph, enabling message-passing across multiscale representations and facilitating long-range interactions without excessive depth or parameterization. Through extensive evaluations on benchmarks such as the Long-Range Graph Benchmark (LRGB), we demonstrate substantial improvements over baseline MPNNs in capturing long-range dependencies while maintaining computational efficiency.
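
Illustrative sketch (not the authors' released code): the abstract describes message-passing over a hierarchy of coarsened graphs with information exchanged between scales. The minimal PyTorch example below is our own simplification of that idea; the names SimpleMP and InterleavedMultiscaleBlock, the averaging-based scale exchange, and the use of dense adjacency and assignment matrices are assumptions made only to keep the sketch short and self-contained.

import torch
import torch.nn as nn

class SimpleMP(nn.Module):
    # One dense message-passing step: X' = ReLU(A_hat @ X @ W).
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, adj):
        return torch.relu(adj @ self.lin(x))

class InterleavedMultiscaleBlock(nn.Module):
    # Message-passing at every scale, then feature exchange between neighbouring scales.
    def __init__(self, dim, num_scales):
        super().__init__()
        self.mps = nn.ModuleList(SimpleMP(dim) for _ in range(num_scales))

    def forward(self, xs, adjs, assigns):
        # xs[s]:      (n_s, dim) node features at scale s (s = 0 is the finest)
        # adjs[s]:    (n_s, n_s) normalized adjacency at scale s
        # assigns[s]: (n_s, n_{s+1}) assignment from scale s to the coarser scale s+1
        xs = [mp(x, a) for mp, x, a in zip(self.mps, xs, adjs)]
        out = list(xs)
        for s, p in enumerate(assigns):
            out[s + 1] = 0.5 * (out[s + 1] + p.t() @ xs[s])  # fine -> coarse pooling
            out[s] = 0.5 * (out[s] + p @ xs[s + 1])          # coarse -> fine lifting
        return out

# Toy usage: a 6-node fine graph coarsened into a 2-node graph.
n, m, d = 6, 2, 16
x_fine, x_coarse = torch.randn(n, d), torch.randn(m, d)
adj_fine, adj_coarse = torch.eye(n), torch.eye(m)  # placeholder normalized adjacencies
assign = torch.zeros(n, m)
assign[:3, 0] = 1.0 / 3  # first three fine nodes pooled into cluster 0
assign[3:, 1] = 1.0 / 3  # last three fine nodes pooled into cluster 1
block = InterleavedMultiscaleBlock(d, num_scales=2)
x_fine, x_coarse = block([x_fine, x_coarse], [adj_fine, adj_coarse], [assign])

Because each coarse node summarizes a cluster of fine nodes, a single block lets information travel across the graph far faster than one hop per layer, which is the mechanism behind the enlarged effective receptive field described above.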

Lay Summary:

Message-Passing Neural Networks (MPNNs) struggle to capture information from distant nodes due to a limited Effective Receptive Field (ERF), the region of the graph influencing a node’s representation. In this paper, we formalize the concept of the ERF in GNNs and propose Interleaved Multiscale Message-Passing Neural Networks (IM-MPNNs). Our method processes graphs at multiple scales, efficiently expanding the ERF to improve long-range information integration without significantly increasing computational cost. Experiments show that IM-MPNNs substantially outperform traditional GNNs in tasks requiring distant interactions, including molecular property prediction and image segmentation.
