Paper Title

Deletion Robust Submodular Maximization over Matroids

Paper Authors

Paul Dütting, Federico Fusco, Silvio Lattanzi, Ashkan Norouzi-Fard, Morteza Zadimoghaddam

Paper Abstract

Maximizing a monotone submodular function is a fundamental task in machine learning. In this paper, we study the deletion-robust version of the problem under the classic matroid constraint. Here the goal is to extract a small-size summary of the dataset that contains a high-value independent set even after an adversary has deleted some elements. We present constant-factor approximation algorithms, whose space complexity depends on the rank $k$ of the matroid and the number $d$ of deleted elements. In the centralized setting we present a $(3.582+O(\varepsilon))$-approximation algorithm with summary size $O(k + \frac{d \log k}{\varepsilon^2})$. In the streaming setting we provide a $(5.582+O(\varepsilon))$-approximation algorithm with summary size and memory $O(k + \frac{d \log k}{\varepsilon^2})$. We complement our theoretical results with an in-depth experimental analysis showing the effectiveness of our algorithms on real-world datasets.
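To make the problem setting concrete, the following is a minimal toy sketch (not the paper's algorithms, and not its $(3.582+O(\varepsilon))$- or $(5.582+O(\varepsilon))$-approximation schemes): a monotone submodular coverage objective, a uniform matroid of rank $k$ as the constraint, the standard greedy baseline as a summary builder, and an adversarial deletion of $d$ chosen elements to show the quantity a deletion-robust summary must protect. The names `coverage_value` and `greedy_uniform_matroid` are illustrative assumptions, not identifiers from the paper.

```python
# Toy illustration of deletion-robust submodular maximization (assumed setup,
# not the paper's algorithm): coverage objective, uniform matroid of rank k.

def coverage_value(sets, chosen):
    """Monotone submodular objective: size of the union of the chosen sets."""
    covered = set()
    for i in chosen:
        covered |= sets[i]
    return len(covered)

def greedy_uniform_matroid(sets, k):
    """Classical greedy: add at most k elements, each with maximum marginal gain."""
    chosen = []
    for _ in range(k):
        best, best_gain = None, 0
        for i in range(len(sets)):
            if i in chosen:
                continue
            gain = coverage_value(sets, chosen + [i]) - coverage_value(sets, chosen)
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:  # no element adds value; stop early
            break
        chosen.append(best)
    return chosen

if __name__ == "__main__":
    # Ground set: element i covers the items in sets[i].
    sets = [{1, 2, 3}, {3, 4}, {5, 6, 7, 8}, {1, 5}, {9}]
    k = 3  # uniform matroid: any subset of size <= k is independent

    summary = greedy_uniform_matroid(sets, k)
    print("summary:", summary, "value:", coverage_value(sets, summary))

    # Adversarial deletion of d elements from the summary: the value of the
    # surviving independent set is what a robust summary must keep high.
    d = 1
    deleted = summary[:d]
    survivors = [i for i in summary if i not in deleted]
    print("after deleting", deleted, "value:", coverage_value(sets, survivors))
```

In this toy example the greedy summary has exactly $k$ elements, so deleting even one of them can cost a large fraction of the value; the paper's point is to keep a slightly larger summary of size $O(k + \frac{d \log k}{\varepsilon^2})$ so that a high-value independent set survives any $d$ deletions.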
