Paper Title

Higher-order Neural Additive Models: An Interpretable Machine Learning Model with Feature Interactions

Paper Authors

Minkyu Kim, Hyun-Soo Choi, Jinho Kim

Paper Abstract

Black-box models, such as deep neural networks, exhibit superior predictive performance, but understanding their behavior is notoriously difficult. Many explainable artificial intelligence methods have been proposed to reveal the decision-making processes of black-box models; however, their application in high-stakes domains remains limited. The recently proposed neural additive model (NAM) achieves state-of-the-art performance among interpretable machine learning methods, providing straightforward interpretations with only a slight performance sacrifice compared with a multi-layer perceptron. However, NAM can only model 1$^{\text{st}}$-order feature interactions; thus, it cannot capture the relationships between input features. To overcome this problem, we propose a novel interpretable machine learning method called higher-order neural additive models (HONAM), together with a feature interaction method for high interpretability. HONAM can model feature interactions of arbitrary order, providing both the high predictive performance and the interpretability that high-stakes domains need. In addition, we propose a novel hidden unit to effectively learn sharp-shape functions. We conducted experiments using various real-world datasets to examine the effectiveness of HONAM. Furthermore, we demonstrate that HONAM can achieve fair AI with a slight performance sacrifice. The source code for HONAM is publicly available.
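
To make the notion of interaction order concrete, here is a minimal PyTorch sketch of a NAM-style additive model extended with explicit 2nd-order interaction terms: each feature passes through its own subnetwork (as in NAM), and pairwise products of the per-feature representations supply the interaction terms. This is an illustrative assumption, not the authors' implementation; the names FeatureNet and AdditiveModelWithInteractions are hypothetical, and HONAM itself generalizes to arbitrary interaction orders and uses its own proposed hidden unit rather than plain ReLU layers.

```python
import itertools
import torch
import torch.nn as nn

class FeatureNet(nn.Module):
    """Per-feature subnetwork: maps one scalar input to a small representation."""
    def __init__(self, hidden=32, out=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, out),
        )

    def forward(self, x):  # x: (batch, 1)
        return self.net(x)

class AdditiveModelWithInteractions(nn.Module):
    """Illustrative additive model with 1st- and 2nd-order terms.

    1st-order terms come from each feature's own subnetwork (NAM-style);
    2nd-order terms are built from element-wise products of pairs of
    per-feature representations. Hypothetical sketch, not HONAM's code.
    """
    def __init__(self, n_features, emb=8):
        super().__init__()
        self.nets = nn.ModuleList(FeatureNet(out=emb) for _ in range(n_features))
        self.first_order = nn.Linear(emb * n_features, 1)
        n_pairs = n_features * (n_features - 1) // 2
        self.second_order = nn.Linear(emb * n_pairs, 1)

    def forward(self, x):  # x: (batch, n_features)
        embs = [net(x[:, i:i + 1]) for i, net in enumerate(self.nets)]
        first = self.first_order(torch.cat(embs, dim=1))
        pairs = [embs[i] * embs[j]
                 for i, j in itertools.combinations(range(len(embs)), 2)]
        second = self.second_order(torch.cat(pairs, dim=1))
        return first + second

model = AdditiveModelWithInteractions(n_features=4)
y = model(torch.randn(16, 4))  # -> shape (16, 1)
```

Because the prediction decomposes into a sum of per-feature and per-pair terms, each term's contribution can be inspected in isolation, which is what makes additive models of this kind interpretable; restricting the model to the first-order terms alone recovers the NAM setting described in the abstract.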
