

Poster

Zero-Shot Generalization of GNNs over Distinct Attribute Domains

Yangyi Shen · Jincheng Zhou · Beatrice Bevilacqua · Joshua Robinson · Charilaos Kanatsoulis · Jure Leskovec · Bruno Ribeiro

East Exhibition Hall A-B #E-2910
Wed 16 Jul 4:30 p.m. PDT — 7 p.m. PDT

Abstract:

Traditional Graph Neural Networks (GNNs) cannot generalize to new graphs whose node attributes differ from those seen during training, making zero-shot generalization across node attribute domains an open challenge in graph machine learning. In this paper, we propose STAGE, which encodes statistical dependencies between attributes rather than individual attribute values, since those values may differ in test graphs. By assuming these dependencies remain invariant under changes in node attributes, STAGE achieves provable generalization guarantees for a family of domain shifts. Empirically, STAGE demonstrates strong zero-shot performance on medium-sized datasets: when trained on multiple graph datasets with different attribute spaces (varying in both the type and the number of attributes) and evaluated on graphs with entirely new attributes, STAGE achieves a relative improvement in Hits@1 of 40% to 103% in link prediction and a 10% improvement in node classification compared to state-of-the-art baselines.
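The core idea of describing nodes by how their attributes co-vary, rather than by the raw attribute values themselves, can be illustrated with a minimal sketch. The Python snippet below is an assumption-laden illustration, not the authors' STAGE architecture: it simply computes a pairwise correlation matrix over attribute columns, a dependency statistic that stays comparable when the raw attribute values are rescaled in a new domain. The function name attribute_dependency_matrix and the toy data are hypothetical.

    # Illustrative sketch only (not the paper's STAGE implementation):
    # characterize a graph's nodes by how their attributes co-vary,
    # a statistic that can remain stable when raw attribute values change.
    import numpy as np

    def attribute_dependency_matrix(X: np.ndarray) -> np.ndarray:
        """Pairwise Pearson correlations between the attribute columns of X (n_nodes x d)."""
        # np.corrcoef treats rows as variables, so transpose the node-attribute matrix.
        return np.corrcoef(X.T)

    # Toy example: two "domains" with different attribute scales but the same dependency structure.
    rng = np.random.default_rng(0)
    base = rng.normal(size=(100, 1))
    X_train = np.hstack([base, 2.0 * base + 0.1 * rng.normal(size=(100, 1))])    # original attribute domain
    X_test = np.hstack([10.0 * base, 20.0 * base + rng.normal(size=(100, 1))])   # rescaled, "new" attribute domain

    print(attribute_dependency_matrix(X_train))  # strongly correlated pair, entries near 1
    print(attribute_dependency_matrix(X_test))   # similar dependency structure despite different raw values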

Lay Summary:

Traditional Graph Neural Networks cannot work on new graphs whose attributes differ from those they were trained on. We introduce STAGE, which focuses on relationships between attributes rather than their specific values, much like recognizing that "taller people tend to buy larger sizes" is a pattern that holds across different product categories. In tests, STAGE outperformed existing methods by up to 103% when predicting user purchases across different domains and by 10% when predicting user information across different social networks. This breakthrough allows AI systems to transfer knowledge between completely different domains without additional training.
