Title
Transfer Learning via $\ell_1$ Regularization
Authors
Abstract
Machine learning algorithms typically require abundant data from a stationary environment. In many real-world applications, however, environments are nonstationary, and a critical issue is how to adapt models effectively as the environment changes. We propose a method for transferring knowledge from a source domain to a target domain via $\ell_1$ regularization: in addition to the ordinary $\ell_1$ penalty, we apply an $\ell_1$ penalty to the difference between the source parameters and the target parameters. Our method therefore induces sparsity both in the estimates themselves and in their changes from the source. The proposed method admits a tight estimation error bound in a stationary environment, and the estimate remains unchanged from the source estimate when residuals are small. Moreover, the estimate stays consistent with the underlying function even when the source estimate is misleading due to nonstationarity. Empirical results demonstrate that the proposed method effectively balances stability and plasticity.
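The abstract's objective, a squared loss plus $\lambda_1\|\beta\|_1 + \lambda_2\|\beta-\beta_{\mathrm{src}}\|_1$, can be minimized by proximal gradient descent, since the proximal operator of the two $\ell_1$ terms has an elementwise closed form (soft-thresholding with kinks at $0$ and at the source coefficient). The sketch below is illustrative, not the authors' implementation: the function names, the ISTA solver choice, and all hyperparameters are assumptions; only the objective comes from the abstract.

```python
import numpy as np

def prox_double_l1(z, a, b, w):
    """Elementwise closed-form solution of
         min_beta 0.5*(beta - z)^2 + a*|beta| + b*|beta - w|,  a, b >= 0,
    i.e. piecewise soft-thresholding with kinks at 0 and at w."""
    s = np.where(w < 0, -1.0, 1.0)            # reflect so the kink w is nonnegative
    z, w = s * z, s * w
    out = np.where(z > w + a + b, z - a - b,  # beyond w: shrink by a + b
          np.where(z >= w + a - b, w,         # snap onto the source value w
          np.where(z > a - b, z - a + b,      # strictly between 0 and w
          np.where(z >= -a - b, 0.0,          # snap onto zero
                   z + a + b))))              # below 0: shrink by a + b
    return s * out

def fit_transfer_lasso(X, y, beta_src, lam1, lam2, n_iter=2000):
    """ISTA (proximal gradient) for
       min_beta (1/2n)||y - X beta||^2 + lam1*||beta||_1
                + lam2*||beta - beta_src||_1."""
    n = X.shape[0]
    step = n / np.linalg.norm(X, 2) ** 2      # 1 / Lipschitz constant of the gradient
    beta = np.zeros_like(beta_src, dtype=float)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        beta = prox_double_l1(beta - step * grad, step * lam1, step * lam2, beta_src)
    return beta
```

This makes the stability/plasticity trade-off concrete: when `lam2 >= lam1` and the residuals at `beta_src` vanish, `beta_src` is a fixed point of the iteration (the estimate stays at the source), while a small `lam2` lets the estimate move freely when the target data contradict the source.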