Paper Title

Rewiring Networks for Graph Neural Network Training Using Discrete Geometry

Authors

Jakub Bober, Anthea Monod, Emil Saucan, Kevin N. Webster

Abstract

Information over-squashing is a phenomenon of inefficient information propagation between distant nodes on networks. It is an important problem that is known to significantly impact the training of graph neural networks (GNNs), as the receptive field of a node grows exponentially. To mitigate this problem, a preprocessing procedure known as rewiring is often applied to the input network. In this paper, we investigate the use of discrete analogues of classical geometric notions of curvature to model information flow on networks and rewire them. We show that these classical notions achieve state-of-the-art performance in GNN training accuracy on a variety of real-world network datasets. Moreover, compared to the current state-of-the-art, these classical notions exhibit a clear advantage in computational runtime by several orders of magnitude.
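
The abstract describes curvature-based rewiring only at a high level. As a rough illustration (a minimal sketch, not the authors' algorithm), the snippet below computes a triangle-augmented Forman-Ricci curvature, one classical discrete notion of curvature, for every edge of an unweighted networkx graph, and then adds a single supporting edge around the most negatively curved edge, the kind of bottleneck associated with over-squashing. The function names, the specific curvature formula, and the one-edge rewiring rule are illustrative assumptions.

```python
import itertools
import networkx as nx


def forman_curvature(G: nx.Graph, u, v) -> int:
    """Triangle-augmented Forman-Ricci curvature of edge (u, v) in an
    unweighted graph: 4 - deg(u) - deg(v) + 3 * (#triangles on the edge).
    (Assumed formula for illustration.)"""
    triangles = len(set(G[u]) & set(G[v]))
    return 4 - G.degree(u) - G.degree(v) + 3 * triangles


def rewire_once(G: nx.Graph) -> nx.Graph:
    """One illustrative rewiring step: locate the most negatively curved
    edge (an information bottleneck) and add a missing edge between a
    neighbour of one endpoint and a neighbour of the other."""
    u, v = min(G.edges(), key=lambda e: forman_curvature(G, *e))
    for x, y in itertools.product(G[u], G[v]):
        if x != y and not G.has_edge(x, y):
            G.add_edge(x, y)
            break
    return G


if __name__ == "__main__":
    # Toy example: two cliques joined by a single bridge edge,
    # a textbook over-squashing bottleneck.
    G = nx.barbell_graph(5, 0)
    print(sorted(forman_curvature(G, *e) for e in G.edges()))
    rewire_once(G)
    print(G.number_of_edges())
```

In this toy graph, the bridge edge has the lowest curvature, so the sketch adds an edge between the two cliques, which is the qualitative effect a curvature-guided rewiring preprocessing step aims for before GNN training.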
