Paper title
Knowledge Enhanced Neural Networks for relational domains
Paper authors
Paper abstract
In the recent past, there has been a growing interest in Neural-Symbolic Integration frameworks, i.e., hybrid systems that integrate connectionist and symbolic approaches to obtain the best of both worlds. In this work we focus on a specific method, KENN (Knowledge Enhanced Neural Networks), a Neural-Symbolic architecture that injects prior logical knowledge into a neural network by adding, on top of it, a residual layer that modifies the initial predictions according to the knowledge. Among the advantages of this strategy is the inclusion of clause weights, learnable parameters that represent the strength of the clauses, meaning that the model can learn the impact of each rule on the final predictions. As a special case, if the training data contradicts a constraint, KENN learns to ignore it, making the system robust to the presence of wrong knowledge. In this paper, we propose an extension of KENN for relational data. One of the main advantages of KENN resides in its scalability, thanks to a flexible treatment of dependencies between the rules, obtained by stacking multiple logical layers. We show experimentally the efficacy of this strategy. The results show that KENN is capable of increasing the performance of the underlying neural network, obtaining better or comparable accuracy with respect to two other related methods that combine learning with logic, while requiring significantly less time for learning.
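The residual-layer idea described in the abstract can be illustrated with a minimal sketch: the base network produces pre-activations for a set of ground atoms, and each weighted clause contributes an additive correction that pushes the predictions toward satisfying the clause, scaled by a learnable clause weight. This is a simplified NumPy illustration of the general mechanism, not the authors' implementation; the function name `kenn_like_enhancement`, the literal encoding, and the toy clause are assumptions made for the example.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def kenn_like_enhancement(z, clauses, clause_weights):
    """Add a knowledge-based residual to the base network's pre-activations.

    z: 1-D array of pre-activations, one per ground atom.
    clauses: list of clauses; each clause is a list of (atom_index, sign)
             literals, with sign = +1 for a positive literal and -1 for a
             negated one.
    clause_weights: one learnable non-negative weight per clause, encoding
             how strongly that clause influences the final predictions
             (a weight near 0 effectively ignores the clause).
    """
    delta = np.zeros_like(z)
    for lits, w in zip(clauses, clause_weights):
        idx = np.array([i for i, _ in lits])
        sgn = np.array([s for _, s in lits], dtype=float)
        # Distribute the boost among the clause's literals, favoring the
        # literal that is already closest to being satisfied.
        delta[idx] += w * sgn * softmax(sgn * z[idx])
    return z + delta

# Toy example: one clause  ¬Smokes(x) ∨ Cancer(x)  with weight 1.5.
z = np.array([2.0, -1.0])  # pre-activations for Smokes(x), Cancer(x)
z_enhanced = kenn_like_enhancement(z, [[(0, -1), (1, +1)]], [1.5])
```

In the toy example the clause raises the pre-activation of `Cancer(x)` and lowers that of `Smokes(x)`, moving the predictions toward satisfying the rule; with a clause weight learned to be 0, the output would equal the base predictions, which mirrors the robustness-to-wrong-knowledge behavior the abstract describes.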