Paper Title

BI-MAML: Balanced Incremental Approach for Meta Learning

Authors

Yang Zheng, Jinlin Xiang, Kun Su, Eli Shlizerman

Abstract


We present a novel Balanced Incremental Model Agnostic Meta Learning system (BI-MAML) for learning multiple tasks. Our method implements a meta-update rule to incrementally adapt its model to new tasks without forgetting old tasks. Such a capability is not possible in current state-of-the-art MAML approaches. These methods adapt effectively to new tasks but suffer from the 'catastrophic forgetting' phenomenon, in which new tasks that are streamed into the model degrade the model's performance on previously learned tasks. Our system performs meta-updates with only a few shots and can successfully accomplish them. Our key idea for achieving this is the design of a balanced learning strategy for the baseline model. The strategy sets the baseline model to perform equally well on various tasks and incorporates time efficiency. The balanced learning strategy enables BI-MAML to both outperform other state-of-the-art models in terms of classification accuracy for existing tasks and also accomplish efficient adaptation to similar new tasks with fewer required shots. We evaluate BI-MAML by conducting comparisons on two common benchmark datasets with multiple image classification tasks. BI-MAML demonstrates advantages in both accuracy and efficiency.
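For readers unfamiliar with the meta-update rule the abstract builds on, the following is a minimal sketch of a standard first-order MAML meta-update on toy linear-regression tasks. This illustrates the generic inner-loop (per-task adaptation) / outer-loop (meta-initialization update) structure that MAML-style methods share; it is not BI-MAML itself, and the task setup, learning rates, and function names here are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(w, X, y):
    # Gradient of mean-squared error for a linear model y_hat = X @ w.
    return 2 * X.T @ (X @ w - y) / len(y)

def maml_meta_update(w, tasks, inner_lr=0.01, meta_lr=0.1):
    """One first-order MAML meta-update over a batch of tasks.

    Each task is a tuple (X_support, y_support, X_query, y_query).
    """
    meta_grad = np.zeros_like(w)
    for Xs, ys, Xq, yq in tasks:
        # Inner loop: one gradient step on the task's support set.
        w_task = w - inner_lr * loss_grad(w, Xs, ys)
        # Outer loop: accumulate the query-set gradient at the adapted
        # weights (first-order approximation, ignoring second derivatives).
        meta_grad += loss_grad(w_task, Xq, yq)
    return w - meta_lr * meta_grad / len(tasks)

def make_task(true_w, n=20):
    # Toy task: linear regression with task-specific true weights,
    # split into a support set (adaptation) and a query set (evaluation).
    X = rng.normal(size=(n, len(true_w)))
    y = X @ true_w
    return X[:10], y[:10], X[10:], y[10:]

tasks = [make_task(rng.normal(size=3)) for _ in range(5)]
w = np.zeros(3)
for _ in range(100):
    w = maml_meta_update(w, tasks)
```

After meta-training, `w` serves as an initialization from which a single inner-loop step on a new task's few support examples already lowers that task's loss; BI-MAML's contribution, per the abstract, is balancing this update across tasks so that adapting to new tasks does not degrade old ones.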
