Paper Title
Tight Bounds on Minimax Regret under Logarithmic Loss via Self-Concordance

Authors

Blair Bilodeau, Dylan J. Foster, Daniel M. Roy

Abstract
We consider the classical problem of sequential probability assignment under logarithmic loss while competing against an arbitrary, potentially nonparametric class of experts. We obtain tight bounds on the minimax regret via a new approach that exploits the self-concordance property of the logarithmic loss. We show that for any expert class with (sequential) metric entropy $\mathcal{O}(\gamma^{-p})$ at scale $\gamma$, the minimax regret is $\mathcal{O}(n^{p/(p+1)})$, and that this rate cannot be improved without additional assumptions on the expert class under consideration. As an application of our techniques, we resolve the minimax regret for nonparametric Lipschitz classes of experts.