Paper Title

On Hypothesis Transfer Learning of Functional Linear Models

Authors

Haotian Lin, Matthew Reimherr

Abstract

We study the transfer learning (TL) for the functional linear regression (FLR) under the Reproducing Kernel Hilbert Space (RKHS) framework, observing that the TL techniques in existing high-dimensional linear regression are not compatible with the truncation-based FLR methods, as functional data are intrinsically infinite-dimensional and generated by smooth underlying processes. We measure the similarity across tasks using RKHS distance, allowing the type of information being transferred to be tied to the properties of the imposed RKHS. Building on the hypothesis offset transfer learning paradigm, two algorithms are proposed: one conducts the transfer when positive sources are known, while the other leverages aggregation techniques to achieve robust transfer without prior information about the sources. We establish asymptotic lower bounds for this learning problem and show that the proposed algorithms enjoy a matching upper bound. These analyses provide statistical insights into factors that contribute to the dynamics of the transfer. We also extend the results to functional generalized linear models. The effectiveness of the proposed algorithms is demonstrated via extensive synthetic data as well as real-world data applications.
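The hypothesis offset paradigm mentioned above can be illustrated with a minimal sketch: fit an estimator on a large source sample, then estimate only a small correction (the "offset") on the scarce target sample, under the assumption that source and target functions are close in RKHS norm. The sketch below uses plain kernel ridge regression on scalar inputs as a stand-in for the paper's functional covariates; all function names, kernels, and parameters are illustrative, not the authors' estimator.

```python
import numpy as np

def kernel_ridge_fit(K, y, lam):
    # Representer-theorem solve: (K + lam * n * I) alpha = y.
    n = len(y)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def gram(a, b, gamma=1.0):
    # Gaussian kernel Gram matrix (illustrative choice of RKHS).
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

rng = np.random.default_rng(0)
f_tgt = lambda x: np.sin(3 * x)               # target regression function
f_src = lambda x: np.sin(3 * x) + 0.1 * x     # similar source: small RKHS offset

# Large source sample, small target sample.
xs = rng.uniform(-1, 1, 200); ys = f_src(xs) + 0.1 * rng.standard_normal(200)
xt = rng.uniform(-1, 1, 20);  yt = f_tgt(xt) + 0.1 * rng.standard_normal(20)

# Step 1: fit on the (positive) source task.
a_src = kernel_ridge_fit(gram(xs, xs), ys, lam=1e-3)
pred_src = lambda x: gram(x, xs) @ a_src

# Step 2: fit the offset on target residuals with stronger regularization,
# since the offset is assumed small in RKHS norm and needs little data.
resid = yt - pred_src(xt)
a_off = kernel_ridge_fit(gram(xt, xt), resid, lam=1e-2)
pred_tl = lambda x: pred_src(x) + gram(x, xt) @ a_off

# Baseline: target-only fit on the 20 target points.
a_tgt = kernel_ridge_fit(gram(xt, xt), yt, lam=1e-2)
pred_tgt = lambda x: gram(x, xt) @ a_tgt

grid = np.linspace(-1, 1, 500)
mse_tl = np.mean((pred_tl(grid) - f_tgt(grid)) ** 2)
mse_tgt = np.mean((pred_tgt(grid) - f_tgt(grid)) ** 2)
print(f"transfer MSE: {mse_tl:.4f}  target-only MSE: {mse_tgt:.4f}")
```

When the source-target offset is small relative to the target function, the transferred estimate typically tracks the target function more closely than the target-only fit; when the offset is large (negative transfer), the aggregation-based algorithm in the paper is the safeguard.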
