Paper Title

Expressive Power of Invariant and Equivariant Graph Neural Networks

Authors

Waïss Azizian, Marc Lelarge

Abstract

Various classes of Graph Neural Networks (GNN) have been proposed and shown to be successful in a wide range of applications with graph-structured data. In this paper, we propose a theoretical framework able to compare the expressive power of these GNN architectures. The current universality theorems only apply to intractable classes of GNNs. Here, we prove the first approximation guarantees for practical GNNs, paving the way for a better understanding of their generalization. Our theoretical results are proved for invariant GNNs computing a graph embedding (a permutation of the nodes of the input graph does not affect the output) and equivariant GNNs computing an embedding of the nodes (a permutation of the input permutes the output accordingly). We show that Folklore Graph Neural Networks (FGNN), which are tensor-based GNNs augmented with matrix multiplication, are the most expressive architectures proposed so far for a given tensor order. We illustrate our results on the Quadratic Assignment Problem (an NP-hard combinatorial problem) by showing that FGNNs are able to learn how to solve the problem, leading to much better average performance than existing algorithms (based on spectral methods, SDP, or other GNN architectures). On the practical side, we also implement masked tensors to handle batches of graphs of varying sizes.
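
To make the invariance/equivariance distinction and the role of matrix multiplication concrete, below is a minimal self-contained sketch (not the authors' implementation; the layer definitions and names such as equivariant_layer and fgnn_style_update are illustrative assumptions). It checks that a sum-aggregation layer is permutation-equivariant, that a summed readout is permutation-invariant, and that an FGNN-style edge update built on matrix multiplication is equivariant on edge tensors.

    # Illustrative sketch only: toy layers checking permutation invariance
    # and equivariance, plus an FGNN-style update using matrix multiplication.
    import numpy as np

    rng = np.random.default_rng(0)

    def equivariant_layer(A, h):
        # Sum aggregation: each node receives the sum of its neighbours'
        # features. Permuting the nodes permutes the output rows.
        return A @ h

    def invariant_readout(A, h):
        # Graph embedding: summing node embeddings makes node order irrelevant.
        return equivariant_layer(A, h).sum(axis=0)

    def fgnn_style_update(H):
        # FGNN-style edge update (sketch): mix the edge tensor with the
        # matrix product sum_k H[i, k] * H[k, j], the operation that lifts
        # expressiveness beyond purely local message passing.
        return H + H @ H

    n, d = 5, 3
    A = rng.integers(0, 2, size=(n, n))
    A = np.triu(A, 1); A = A + A.T          # random undirected adjacency matrix
    h = rng.standard_normal((n, d))          # node features
    P = np.eye(n)[rng.permutation(n)]        # permutation matrix

    # Equivariance: f(P A P^T, P h) == P f(A, h)
    assert np.allclose(equivariant_layer(P @ A @ P.T, P @ h),
                       P @ equivariant_layer(A, h))
    # Invariance: g(P A P^T, P h) == g(A, h)
    assert np.allclose(invariant_readout(P @ A @ P.T, P @ h),
                       invariant_readout(A, h))
    # FGNN-style update: permuting rows and columns of the edge tensor
    # permutes the output the same way.
    H = rng.standard_normal((n, n))
    assert np.allclose(fgnn_style_update(P @ H @ P.T),
                       P @ fgnn_style_update(H) @ P.T)
    print("invariance / equivariance checks passed")

The checks pass because P A P^T relabels the nodes consistently: the sum-aggregation layer commutes with P (equivariance), the summed readout cancels P entirely (invariance), and H @ H transforms as P H H P^T under relabeling, matching the paper's point that matrix multiplication preserves equivariance while adding expressive power.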
