Paper Title

Domain-Liftability of Relational Marginal Polytopes

Authors

Ondrej Kuzelka, Yuyi Wang

Abstract

We study computational aspects of relational marginal polytopes which are statistical relational learning counterparts of marginal polytopes, well-known from probabilistic graphical models. Here, given some first-order logic formula, we can define its relational marginal statistic to be the fraction of groundings that make this formula true in a given possible world. For a list of first-order logic formulas, the relational marginal polytope is the set of all points that correspond to the expected values of the relational marginal statistics that are realizable. In this paper, we study the following two problems: (i) Do domain-liftability results for the partition functions of Markov logic networks (MLNs) carry over to the problem of relational marginal polytope construction? (ii) Is the relational marginal polytope containment problem hard under some plausible complexity-theoretic assumptions? Our positive results have consequences for lifted weight learning of MLNs. In particular, we show that weight learning of MLNs is domain-liftable whenever the computation of the partition function of the respective MLNs is domain-liftable (this result has not been rigorously proven before).
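The abstract's central definition — the relational marginal statistic of a formula as the fraction of its groundings that hold in a given possible world — can be illustrated with a small sketch. The formula (friends(x, y) implies smokes(y)), the three-element domain, and the particular world below are our own toy choices, not examples from the paper:

```python
from itertools import product

# Toy domain and possible world (a possible world = the set of true ground atoms).
domain = ["a", "b", "c"]
world = {
    ("friends", "a", "b"),
    ("friends", "b", "c"),
    ("smokes", "a"),
    ("smokes", "b"),
}

def formula(x, y):
    """One grounding of the example formula: friends(x, y) -> smokes(y)."""
    return (("friends", x, y) not in world) or (("smokes", y) in world)

def relational_marginal_statistic():
    """Fraction of groundings of the formula that are true in `world`."""
    groundings = list(product(domain, repeat=2))  # all (x, y) pairs
    true_count = sum(formula(x, y) for x, y in groundings)
    return true_count / len(groundings)

print(relational_marginal_statistic())  # 8 of 9 groundings hold here
```

For a list of formulas, collecting these statistics into a vector and taking expectations over distributions on possible worlds yields the points whose closure forms the relational marginal polytope discussed in the paper.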
