Paper Title

Learning with Stochastic Orders

Authors

Carles Domingo-Enrich, Yair Schiff, Youssef Mroueh

Abstract

Learning high-dimensional distributions is often done with explicit likelihood modeling or implicit modeling via minimizing integral probability metrics (IPMs). In this paper, we expand this learning paradigm to stochastic orders, namely, the convex or Choquet order between probability measures. Towards this end, exploiting the relation between convex orders and optimal transport, we introduce the Choquet-Toland distance between probability measures, which can be used as a drop-in replacement for IPMs. We also introduce the Variational Dominance Criterion (VDC) to learn probability measures with dominance constraints that encode the desired stochastic order between the learned measure and a known baseline. We analyze both quantities, show that they suffer from the curse of dimensionality, and propose surrogates via input convex maxout networks (ICMNs), which enjoy parametric rates. We provide a min-max framework for learning with stochastic orders and validate it experimentally on synthetic data and high-dimensional image generation, with promising results. Finally, our ICMN class of convex functions and its derived Rademacher complexity are of independent interest beyond their application to convex orders.
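For readers unfamiliar with the terminology, the convex (Choquet) order and a dominance criterion of the kind the abstract describes can be written down as follows. This is the textbook formulation; the exact function classes and normalizations used for the Choquet-Toland distance and the VDC are as specified in the paper.

```latex
% Convex (Choquet) order: \mu is dominated by \nu when every convex test
% function has a smaller mean under \mu than under \nu.
\[
  \mu \preceq_{\mathrm{cvx}} \nu
  \quad\Longleftrightarrow\quad
  \int f \,\mathrm{d}\mu \;\le\; \int f \,\mathrm{d}\nu
  \quad \text{for all convex } f : \mathbb{R}^d \to \mathbb{R}.
\]
% A variational dominance criterion over a class \mathcal{F} of convex
% functions (e.g., ICMNs): nonpositive exactly when the order holds over
% \mathcal{F}.
\[
  \mathrm{VDC}_{\mathcal{F}}(\mu, \nu)
  \;=\;
  \sup_{f \in \mathcal{F}}
  \left( \int f \,\mathrm{d}\mu - \int f \,\mathrm{d}\nu \right).
\]
```

By Strassen's theorem, \mu \preceq_{\mathrm{cvx}} \nu holds exactly when \nu is the law of X + Z with X ~ \mu and E[Z | X] = 0, so convex dominance formalizes "more spread out with the same mean".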
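The following is a minimal PyTorch sketch of an input convex maxout network, illustrating why such architectures are convex in their input by construction. The class name `ICMN`, the layer layout, and all hyperparameters here are illustrative assumptions; the paper's exact parameterization may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ICMN(nn.Module):
    """Sketch of an input convex maxout network (illustrative, not the
    paper's exact parameterization).

    f(x) is convex in x because:
      * affine functions of x are convex;
      * a pointwise max over convex functions is convex;
      * sums of convex functions with non-negative weights are convex.
    """

    def __init__(self, dim_in: int, width: int = 64, pieces: int = 4, depth: int = 2):
        super().__init__()
        self.width, self.pieces = width, pieces
        # Layer 0: maxout over `pieces` affine functions of the input.
        self.inp = nn.Linear(dim_in, width * pieces)
        # Deeper layers: non-negative hidden weights plus an unconstrained
        # affine skip connection from the input.
        self.hid = nn.ModuleList(nn.Linear(width, width * pieces) for _ in range(depth - 1))
        self.skip = nn.ModuleList(nn.Linear(dim_in, width * pieces) for _ in range(depth - 1))
        self.out = nn.Linear(width, 1)

    def _maxout(self, pre):
        # (batch, width * pieces) -> (batch, width): max over the pieces.
        return pre.view(-1, self.width, self.pieces).amax(dim=-1)

    def forward(self, x):
        z = self._maxout(self.inp(x))  # each unit is convex in x
        for hid, skip in zip(self.hid, self.skip):
            # Clamping hidden weights at 0 preserves convexity in x.
            z = self._maxout(F.linear(z, hid.weight.clamp(min=0), hid.bias) + skip(x))
        # The output layer also uses non-negative weights on the convex features.
        return F.linear(z, self.out.weight.clamp(min=0), self.out.bias)


# Smoke test: midpoint convexity along a random segment.
f = ICMN(dim_in=2)
a, b = torch.randn(1, 2), torch.randn(1, 2)
assert (f((a + b) / 2) <= (f(a) + f(b)) / 2 + 1e-6).all()
```

Clamping the weights inside `forward` (rather than projecting after each optimizer step) keeps every function the network can represent inside the convex class, which is what makes it usable as the critic class in the criteria above.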
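For the min-max framework, a hypothetical training loop might look like the sketch below, reusing the `ICMN` sketch above as the convex critic and a one-sided, VDC-style objective. The generator, `sample_target`, and all hyperparameters are placeholders; the paper's symmetric Choquet-Toland objective would combine both one-sided terms.

```python
import torch
import torch.nn as nn

# min over the generator, max over the convex critic:
#   min_theta  max_{f in ICMN}  E_{x ~ mu_theta}[f(x)] - E_{y ~ nu}[f(y)]

dim = 2
g = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, dim))  # placeholder generator
f = ICMN(dim_in=dim)                                                  # convex critic (sketch above)

opt_f = torch.optim.Adam(f.parameters(), lr=1e-4)
opt_g = torch.optim.Adam(g.parameters(), lr=1e-4)

def sample_target(n):
    # Stand-in for the baseline/data measure nu.
    return torch.randn(n, dim) + 2.0

for step in range(500):
    noise = torch.randn(128, dim)
    real = sample_target(128)

    # Critic ascent: sharpen the one-sided dominance gap over the convex class.
    opt_f.zero_grad()
    loss_f = -(f(g(noise).detach()).mean() - f(real).mean())
    loss_f.backward()
    opt_f.step()

    # Generator descent: shrink the estimated discrepancy.
    opt_g.zero_grad()
    loss_g = f(g(noise)).mean() - f(real).mean()
    loss_g.backward()
    opt_g.step()
```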
