Paper Title


Simple Calibration via Geodesic Kernels

Paper Authors

Jayanta Dey, Haoyin Xu, Ashwin De Silva, Joshua T. Vogelstein

Paper Abstract

Deep discriminative approaches, such as decision forests and deep neural networks, have recently found applications in many important real-world scenarios. However, deploying these learning algorithms in safety-critical applications raises concerns, particularly when it comes to ensuring calibration for both in-distribution and out-of-distribution regions. Many popular methods for in-distribution (ID) calibration, such as isotonic and Platt's sigmoidal regression, exhibit adequate ID calibration performance. However, these methods are not calibrated for the entire feature space, leading to overconfidence in the out-of-distribution (OOD) region. Existing OOD calibration methods generally exhibit poor ID calibration. In this paper, we jointly address the ID and OOD problems. We leverage the fact that deep models learn to partition feature space into a union of polytopes, that is, flat-sided geometric objects. We introduce a geodesic distance to measure the distance between these polytopes and further distinguish samples within the same polytope using a Gaussian kernel. Our experiments on both tabular and vision benchmarks show that the proposed approaches, namely Kernel Density Forest (KDF) and Kernel Density Network (KDN), obtain well-calibrated posteriors for both ID and OOD samples, while mostly preserving the classification accuracy and extrapolating beyond the training data to handle OOD inputs appropriately.
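The core intuition in the abstract (Gaussian kernels attached to the regions a deep model carves out, so that confidence decays away from the training data) can be illustrated with a toy sketch. This is not the paper's KDF/KDN algorithm: the abstract does not specify how polytope representatives or the geodesic distance are computed, so this sketch substitutes hypothetical Euclidean-space "polytope centers" and plain isotropic Gaussian kernels, with a `kernel_posterior` helper name invented here for illustration.

```python
import numpy as np

def gaussian_kernel(x, center, bandwidth):
    # Isotropic Gaussian kernel value for a sample relative to one center.
    d2 = np.sum((x - center) ** 2)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def kernel_posterior(x, centers, labels, n_classes, bandwidth=1.0):
    """Class posterior built from Gaussian kernels at (hypothetical) polytope centers.

    Near the training data, the class whose centers dominate the kernel mass
    wins with high confidence. Far from every center, all kernel values decay
    toward zero and the posterior falls back to the uniform prior -- the
    non-overconfident OOD behavior the abstract describes.
    """
    weights = np.zeros(n_classes)
    for center, y in zip(centers, labels):
        weights[y] += gaussian_kernel(x, center, bandwidth)
    total = weights.sum()
    if total < 1e-12:  # effectively no kernel support: treat as OOD
        return np.full(n_classes, 1.0 / n_classes)
    return weights / total

# Two centers, one per class.
centers = np.array([[0.0, 0.0], [5.0, 5.0]])
labels = [0, 1]

p_near = kernel_posterior(np.array([0.1, 0.0]), centers, labels, n_classes=2)
p_far = kernel_posterior(np.array([100.0, 100.0]), centers, labels, n_classes=2)
# p_near is confidently class 0; p_far is uniform (0.5, 0.5).
```

The design point mirrored here is the fallback branch: a purely discriminative softmax would still emit a confident label at the far-away query, whereas a kernel-density posterior lets its confidence vanish with distance from the training support.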
