Paper Title


Set2Graph: Learning Graphs From Sets

Paper Authors

Hadar Serviansky, Nimrod Segol, Jonathan Shlomi, Kyle Cranmer, Eilam Gross, Haggai Maron, Yaron Lipman

Abstract


Many problems in machine learning can be cast as learning functions from sets to graphs, or more generally to hypergraphs; in short, Set2Graph functions. Examples include clustering, learning vertex and edge features on graphs, and learning features on triplets in a collection. A natural approach for building Set2Graph models is to characterize all linear equivariant set-to-hypergraph layers and stack them with non-linear activations. This poses two challenges: (i) the expressive power of these networks is not well understood; and (ii) these models would suffer from high, often intractable computational and memory complexity, as their dimension grows exponentially. This paper advocates a family of neural network models for learning Set2Graph functions that is both practical and of maximal expressive power (universal), that is, can approximate arbitrary continuous Set2Graph functions over compact sets. Testing these models on different machine learning tasks, mainly an application to particle physics, we find them favorable to existing baselines.
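As a rough illustration of the "set in, graph out" setting the abstract describes, a minimal Set2Graph-style model can be sketched as a permutation-equivariant set encoder followed by a broadcast over ordered pairs that scores each candidate edge. This is a hedged sketch with hypothetical toy weights, not the authors' trained architecture; the names `set2graph_scores`, `W_phi`, and `W_edge` are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy weights -- illustration only, not the paper's model.
d_in, d_hid = 4, 8
W_phi = rng.normal(size=(d_in, d_hid))      # per-element encoder
W_edge = rng.normal(size=(2 * d_hid, 1))    # pairwise edge scorer

def set2graph_scores(X):
    """Map a set of n input vectors (n, d_in) to an (n, n) edge-score matrix.

    Sketch of the common 'encode, broadcast to pairs, score' pattern:
    1. Encode each element, then add a permutation-invariant (sum) context.
    2. Form all ordered pairs of element features by broadcasting.
    3. Score each pair with a small map (here, a single linear layer).
    """
    h = np.tanh(X @ W_phi)                   # (n, d_hid) per-element features
    h = h + h.sum(axis=0, keepdims=True)     # inject global set context
    n = h.shape[0]
    pairs = np.concatenate(
        [np.repeat(h, n, axis=0), np.tile(h, (n, 1))], axis=1
    )                                        # (n*n, 2*d_hid) ordered pairs
    return (pairs @ W_edge).reshape(n, n)    # (n, n) edge scores

X = rng.normal(size=(5, d_in))
S = set2graph_scores(X)
print(S.shape)  # (5, 5)
```

Because the encoder acts per element (plus a sum that ignores order), permuting the input set simply permutes the rows and columns of the output score matrix, which is the equivariance property such set-to-graph layers are built around.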
