Paper Title

Feature Forgetting in Continual Representation Learning

Paper Authors

Xiao Zhang, Dejing Dou, Ji Wu

Paper Abstract

In continual and lifelong learning, good representation learning can help increase performance and reduce sample complexity when learning new tasks. There is evidence that representations do not suffer from "catastrophic forgetting" even in plain continual learning, but little else is known about their characteristics. In this paper, we aim to gain a better understanding of representation learning in continual learning, with a focus on the feature forgetting problem. We devise a protocol for evaluating representations in continual learning, and then use it to present an overview of the basic trends of continual representation learning, showing its consistent deficiencies and potential issues. To study the feature forgetting problem, we create a synthetic dataset to identify and visualize the prevalence of feature forgetting in neural networks. Finally, we propose a simple technique that uses gating adapters to mitigate feature forgetting. We conclude by discussing how improving representation learning benefits both old and new tasks in continual learning.
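The abstract does not spell out the gating adapter's architecture, but a common realization of the idea is a small residual bottleneck module whose output is mixed into the backbone features through a learned gate, so that new-task adaptation does not overwrite previously learned features. Below is a minimal PyTorch sketch under that assumption; the module and parameter names (GatingAdapter, bottleneck_dim) are illustrative and may differ from the paper's actual design.

```python
import torch
import torch.nn as nn

class GatingAdapter(nn.Module):
    """Gated residual adapter: h_out = h + g(h) * adapter(h)."""

    def __init__(self, dim: int, bottleneck_dim: int = 64):
        super().__init__()
        # A small bottleneck MLP learns task-specific feature corrections.
        self.down = nn.Linear(dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, dim)
        # The gate decides, per feature, how much of the adapter output
        # to mix into the existing backbone features.
        self.gate = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        delta = self.up(torch.relu(self.down(h)))
        g = torch.sigmoid(self.gate(h))
        # Gated residual: when g is near 0, old features pass through
        # unchanged, limiting interference with earlier tasks.
        return h + g * delta
```

In a typical setup, one such adapter is attached per task (or per layer) while the shared backbone weights are frozen or lightly tuned, so feature updates for a new task are routed through the gate rather than overwriting the shared representation; this is an assumed usage pattern, not a detail stated in the abstract.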
