Paper Title

Adaptive Stopping Rule for Kernel-based Gradient Descent Algorithms

Paper Authors

Xiangyu Chang and Shao-Bo Lin

Paper Abstract

In this paper, we propose an adaptive stopping rule for kernel-based gradient descent (KGD) algorithms. We introduce the empirical effective dimension to quantify the increments of iterations in KGD and derive an implementable early stopping strategy. We analyze the performance of the adaptive stopping rule within the framework of learning theory. Using the recently developed integral operator approach, we rigorously prove the optimality of the adaptive stopping rule by establishing optimal learning rates for KGD equipped with this rule. Furthermore, we give a sharp bound on the number of iterations performed by KGD under the proposed early stopping rule, demonstrating its computational advantage.
