

Poster

Topology-Aware Dynamic Reweighting for Distribution Shifts on Graph

Weihuang Zheng · Jiashuo Liu · Jiaxing Li · Jiayun Wu · Peng Cui · Youyong Kong

East Exhibition Hall A-B #E-2801
Wed 16 Jul 4:30 p.m. PDT — 7 p.m. PDT

Abstract:

Graph Neural Networks (GNNs) are widely used for node classification tasks but often fail to generalize when training and test nodes come from different distributions, limiting their practicality. To address this challenge, recent approaches have adopted invariant learning and sample reweighting techniques from the out-of-distribution (OOD) generalization field. However, invariant learning-based methods face difficulties when applied to graph data, as they rely on the impractical assumption of obtaining real environment labels and on strict invariance, which may not hold in real-world graph structures. Moreover, current sample reweighting methods tend to overlook topological information, potentially leading to suboptimal results. In this work, we introduce the Topology-Aware Dynamic Reweighting (TAR) framework to address distribution shifts by leveraging the inherent graph structure. TAR dynamically adjusts sample weights through gradient flow on the graph edges during training. Instead of relying on strict invariance assumptions, we theoretically prove that our method provides distributional robustness, thereby enhancing out-of-distribution generalization performance on graph data. We demonstrate the framework's effectiveness on extensive node classification OOD benchmarks, where it achieves marked improvements over existing methods.
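To make the core idea concrete, below is a minimal, hypothetical sketch of one topology-aware reweighting step. It is not the paper's actual algorithm: it simply illustrates the notion of weight mass flowing along graph edges, here driven by loss differences across each edge, so that harder (higher-loss) nodes drain weight from their easier neighbors. The function name `tar_reweight_step`, the flux rule, and the step size `eta` are all assumptions for illustration.

```python
import numpy as np

def tar_reweight_step(w, losses, adj, eta=0.1):
    """One illustrative reweighting step on a graph.

    w      : (n,) nonnegative sample weights summing to 1
    losses : (n,) per-node training losses
    adj    : (n, n) symmetric adjacency matrix
    eta    : step size for the flow (hypothetical parameter)

    Weight flows along each edge from the lower-loss endpoint to the
    higher-loss endpoint; because the edge flux is antisymmetric, the
    total weight mass is conserved before clipping.
    """
    # Flux on edge (i, j): positive when node i has the higher loss.
    flux = adj * (losses[:, None] - losses[None, :])
    # Net inflow at each node = sum of fluxes over its incident edges.
    w = w + eta * flux.sum(axis=1)
    # Keep weights a valid distribution.
    w = np.clip(w, 0.0, None)
    return w / w.sum()

# Toy usage: a 3-node path graph 0-1-2 where node 0 has high loss.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
losses = np.array([1.0, 0.1, 0.1])
w = np.full(3, 1.0 / 3.0)
w = tar_reweight_step(w, losses, adj)
weighted_loss = float(w @ losses)  # loss used for the next training step
```

After the step, node 0 (high loss) carries more weight than its neighbor node 1, while the weights still sum to one; the reweighted loss `w @ losses` would then drive the next gradient update of the GNN.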

Lay Summary:

GNNs struggle in real-world situations because they assume the training and testing data are similar, which isn't always true. Some existing solutions either make unrealistic assumptions about the data or ignore the graph's structure, leading to poor performance. In this work, we propose a Topology-Aware Dynamic Reweighting (TAR) framework, which tackles this by focusing on the graph's natural connections. It adjusts the importance of different data points during training by looking at how they are linked, making the model more adaptable to changes in data patterns.
