Paper Title

Are Message Passing Neural Networks Really Helpful for Knowledge Graph Completion?

Authors

Juanhui Li, Harry Shomer, Jiayuan Ding, Yiqi Wang, Yao Ma, Neil Shah, Jiliang Tang, Dawei Yin

Abstract

Knowledge graphs (KGs) facilitate a wide variety of applications. Despite great efforts in creation and maintenance, even the largest KGs are far from complete. Hence, KG completion (KGC) has become one of the most crucial tasks for KG research. Recently, considerable literature in this space has centered around the use of Message Passing (Graph) Neural Networks (MPNNs) to learn powerful embeddings. The success of these methods is naturally attributed to the use of MPNNs over simpler multi-layer perceptron (MLP) models, given their additional message passing (MP) component. In this work, we find that, surprisingly, simple MLP models are able to achieve comparable performance to MPNNs, suggesting that MP may not be as crucial as previously believed. With further exploration, we show that careful scoring function and loss function design has a much stronger influence on KGC model performance. This suggests a conflation of scoring function design, loss function design, and MP in prior work, with promising insights regarding the scalability of state-of-the-art KGC methods today, as well as careful attention to more suitable MP designs for KGC tasks tomorrow. Our code is publicly available at: https://github.com/Juanhui28/Are_MPNNs_helpful.
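To make the abstract's central comparison concrete, the following is a minimal sketch of an MLP-based KGC model: entity and relation embeddings are passed through a plain MLP encoder (standing in for the message-passing component of an MPNN) and triples are scored with a DistMult-style scoring function, one common choice in the KGC literature. This is an illustrative toy with random weights, not the authors' exact architecture; all dimensions and names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, W2):
    # Two-layer MLP encoder with ReLU; in the paper's framing this
    # replaces the message-passing (MP) step of an MPNN encoder.
    return np.maximum(x @ W1, 0.0) @ W2

def distmult_score(e_h, w_r, e_t):
    # DistMult scoring function: sum_i e_h[i] * w_r[i] * e_t[i]
    return np.sum(e_h * w_r * e_t, axis=-1)

# Hypothetical toy KG: 5 entities, 2 relations, 16-dim embeddings.
dim, num_entities, num_relations = 16, 5, 2
ent = rng.normal(size=(num_entities, dim))   # entity embedding table
rel = rng.normal(size=(num_relations, dim))  # relation embedding table
W1 = rng.normal(size=(dim, dim))
W2 = rng.normal(size=(dim, dim))

# Score a batch of (head, relation, tail) triples.
h, r, t = np.array([0, 1]), np.array([0, 1]), np.array([2, 3])
scores = distmult_score(mlp(ent[h], W1, W2), rel[r], mlp(ent[t], W1, W2))
print(scores.shape)  # one scalar score per triple
```

The point of the paper's experiments is that an encoder of this shape, paired with a well-chosen scoring function and loss, can match MPNN encoders on KGC benchmarks, which is why the authors attribute much of the reported gains to scoring/loss design rather than to message passing itself.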
