Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/173329
Title: Union subgraph neural networks
Authors: Xu, Jiaxing; Zhang, Aihu; Bian, Qingtian; Dwivedi, Vijay Prakash; Ke, Yiping
Keywords: Computer and Information Science
Issue Date: 2024
Source: Xu, J., Zhang, A., Bian, Q., Dwivedi, V. P. & Ke, Y. (2024). Union subgraph neural networks. The 38th AAAI Conference on Artificial Intelligence (AAAI 2024), 38, 16173-16183. https://dx.doi.org/10.1609/aaai.v38i14.29551
Project: IAF-PP; MOE-T2EP20220-0006
Conference: The 38th AAAI Conference on Artificial Intelligence (AAAI 2024)
Abstract: Graph Neural Networks (GNNs) are widely used for graph representation learning in many application domains. The expressiveness of vanilla GNNs is upper-bounded by the 1-dimensional Weisfeiler-Leman (1-WL) test, as they operate on rooted subtrees through iterative message passing. In this paper, we empower GNNs by injecting neighbor-connectivity information extracted from a new type of substructure. We first investigate different kinds of connectivities existing in a local neighborhood and identify a substructure called the union subgraph, which is able to capture the complete picture of the 1-hop neighborhood of an edge. We then design a shortest-path-based substructure descriptor that possesses three nice properties and can effectively encode the high-order connectivities in union subgraphs. By infusing the encoded neighbor connectivities, we propose a novel model, namely Union Subgraph Neural Network (UnionSNN), which is proven to be strictly more powerful than 1-WL in distinguishing non-isomorphic graphs. Additionally, the local encoding from union subgraphs can also be injected into arbitrary message-passing neural networks (MPNNs) and Transformer-based models as a plugin. Extensive experiments on 18 benchmarks of both graph-level and node-level tasks demonstrate that UnionSNN outperforms state-of-the-art baseline models, with competitive computational efficiency. The injection of our local encoding into existing models is able to boost performance by up to 11.09%. Our code is available at https://github.com/AngusMonroe/UnionSNN.
URI: https://hdl.handle.net/10356/173329
URL: https://ojs.aaai.org/index.php/AAAI/article/view/29551
DOI: 10.1609/aaai.v38i14.29551
Schools: School of Computer Science and Engineering
Research Centres: Computational Intelligence Lab (CIL)
Rights: © 2024 Association for the Advancement of Artificial Intelligence. All rights reserved. This article may be downloaded for personal use only. Any other use requires prior permission of the copyright holder. The Version of Record is available online at https://doi.org/10.1609/aaai.v38i14.29551.
Fulltext Permission: open
Fulltext Availability: With Fulltext
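As the abstract describes, the union subgraph of an edge captures the complete 1-hop neighborhood of that edge. A minimal sketch of extracting such a substructure from an adjacency-set representation (the function name `union_subgraph` and the node set used here, the union of the closed 1-hop neighborhoods of both endpoints, are assumptions inferred from the abstract, not the paper's formal definition):

```python
def union_subgraph(adj, u, v):
    """Induced subgraph on the union of the closed 1-hop
    neighborhoods of the endpoints u and v of an edge.

    `adj` maps each node to the set of its neighbors.
    Returns the same dict representation, restricted to the
    union-subgraph node set.
    """
    nodes = adj[u] | adj[v] | {u, v}
    return {n: adj[n] & nodes for n in nodes}


# Example: a small undirected graph given as adjacency sets.
adj = {
    0: {1, 2},
    1: {0, 2, 3},
    2: {0, 1},
    3: {1, 4},
    4: {3},
}

sub = union_subgraph(adj, 0, 1)
# Node 4 is two hops from the edge (0, 1), so it is excluded:
# sub has nodes {0, 1, 2, 3}, and node 3 keeps only its
# neighbor 1 inside the subgraph.
```

The descriptor in the paper then encodes connectivity (e.g. shortest-path information) within this substructure; that step is not sketched here.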
Appears in Collections: SCSE Conference Papers
Files in This Item:

File | Description | Size | Format
---|---|---|---
AAAI2024_Union_Subgraph_Neural_Networks.pdf | | 727.72 kB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.