Paper Title
Boosting Simple Learners
Paper Authors
Paper Abstract
Boosting is a celebrated machine learning approach based on the idea of combining weak and moderately inaccurate hypotheses into a strong and accurate one. We study boosting under the assumption that the weak hypotheses belong to a class of bounded capacity. This assumption is inspired by the common convention that weak hypotheses are "rules-of-thumb" from an "easy-to-learn class" (Schapire and Freund '12; Shalev-Shwartz and Ben-David '14). Formally, we assume the class of weak hypotheses has bounded VC dimension. We focus on two main questions: (i) Oracle Complexity: How many weak hypotheses are needed to produce an accurate hypothesis? We design a novel boosting algorithm and demonstrate that it circumvents a classical lower bound by Freund and Schapire ('95, '12). Whereas the lower bound shows that $\Omega(1/\gamma^2)$ weak hypotheses with $\gamma$-margin are sometimes necessary, our new method requires only $\tilde{O}(1/\gamma)$ weak hypotheses, provided that they belong to a class of bounded VC dimension. Unlike previous boosting algorithms, which aggregate the weak hypotheses by majority votes, the new boosting algorithm uses more complex ("deeper") aggregation rules. We complement this result by showing that complex aggregation rules are in fact necessary to circumvent the aforementioned lower bound. (ii) Expressivity: Which tasks can be learned by boosting weak hypotheses from a bounded-VC class? Can complex concepts that are "far away" from the class be learned? Towards answering the first question we introduce combinatorial-geometric parameters which capture expressivity in boosting. As a corollary we provide an affirmative answer to the second question for well-studied classes, including half-spaces and decision stumps. Along the way, we establish and exploit connections with Discrepancy Theory.
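The abstract contrasts the new method with classical boosting, which aggregates weak hypotheses by a weighted majority vote. The sketch below illustrates that classical baseline (AdaBoost with decision stumps, a weak class of bounded VC dimension), not the paper's new algorithm; all function names and the toy halfspace dataset are illustrative assumptions. The classical analysis requires on the order of $\log(1/\epsilon)/\gamma^2$ rounds, which is exactly the $\Omega(1/\gamma^2)$ regime the paper improves to $\tilde{O}(1/\gamma)$ by using deeper aggregation rules.

```python
# Minimal sketch of CLASSICAL boosting (AdaBoost) with majority-vote
# aggregation over decision stumps, a bounded-VC weak class.
# This is the baseline the abstract contrasts against, NOT the paper's
# new algorithm; names and the toy dataset are illustrative.
import numpy as np

def weak_learner(X, y, w):
    """Best decision stump under sample weights w: (feature, threshold, sign)."""
    n, d = X.shape
    best, best_err = None, np.inf
    for j in range(d):
        for t in np.unique(X[:, j]):
            for s in (+1, -1):
                pred = s * np.where(X[:, j] <= t, 1, -1)
                err = w[pred != y].sum()
                if err < best_err:
                    best, best_err = (j, t, s), err
    return best, best_err

def stump_predict(h, X):
    j, t, s = h
    return s * np.where(X[:, j] <= t, 1, -1)

def adaboost(X, y, rounds):
    """AdaBoost: the output is a weighted MAJORITY VOTE over weak stumps."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        h, err = weak_learner(X, y, w)
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)  # vote weight grows with the edge
        pred = stump_predict(h, X)
        w *= np.exp(-alpha * y * pred)         # reweight toward current mistakes
        w /= w.sum()
        ensemble.append((alpha, h))
    return ensemble

def predict(ensemble, X):
    votes = sum(a * stump_predict(h, X) for a, h in ensemble)
    return np.sign(votes)

# Toy usage: a halfspace target, one of the well-studied classes the
# abstract mentions for the expressivity question.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
ens = adaboost(X, y, rounds=20)
print("train accuracy:", (predict(ens, X) == y).mean())
```

Note that `predict` here is a flat (depth-one) weighted vote; the paper's point is that such shallow aggregation provably cannot beat the $\Omega(1/\gamma^2)$ barrier, whereas composing weak hypotheses into deeper aggregation rules can.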