Paper Title
FeTrIL: Feature Translation for Exemplar-Free Class-Incremental Learning
Paper Authors
Paper Abstract
Exemplar-free class-incremental learning is very challenging due to the negative effect of catastrophic forgetting. A balance between stability and plasticity of the incremental process is needed in order to obtain good accuracy for past as well as new classes. Existing exemplar-free class-incremental methods focus either on successive fine-tuning of the model, thus favoring plasticity, or on using a feature extractor fixed after the initial incremental state, thus favoring stability. We introduce a method which combines a fixed feature extractor and a pseudo-feature generator to improve the stability-plasticity balance. The generator uses a simple yet effective geometric translation of new class features to create representations of past classes, made of pseudo-features. The translation of features only requires storing the centroid representation of each past class to produce its pseudo-features. Actual features of new classes and pseudo-features of past classes are fed into a linear classifier which is trained incrementally to discriminate between all classes. The incremental process of the proposed method is much faster than that of mainstream methods, which update the entire deep model. Experiments are performed on three challenging datasets with different incremental settings. A comparison with ten existing methods shows that our method outperforms the others in most cases.
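The geometric translation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name and the toy data are assumptions, and only the core idea is shown, namely that features of a new class are shifted so that their mean coincides with the stored centroid of a past class, yielding pseudo-features for that past class.

```python
import numpy as np

def generate_pseudo_features(new_feats: np.ndarray,
                             new_centroid: np.ndarray,
                             past_centroid: np.ndarray) -> np.ndarray:
    """Translate the features of a new class so their distribution is
    re-centered on the stored centroid of a past class (illustrative
    sketch of the geometric translation, not the official code)."""
    return new_feats + (past_centroid - new_centroid)

# Toy example: 5 feature vectors of dimension 4 from a fixed extractor.
rng = np.random.default_rng(0)
new_feats = rng.normal(size=(5, 4))
new_centroid = new_feats.mean(axis=0)
past_centroid = np.array([1.0, -2.0, 0.5, 3.0])  # stored centroid of a past class

pseudo = generate_pseudo_features(new_feats, new_centroid, past_centroid)
# By construction, the pseudo-features' mean equals the past-class centroid,
# while their within-class spread is copied from the new class.
```

Because only one centroid per past class is stored, the memory cost is a single vector per class, which is what makes the approach exemplar-free; the real features of new classes and these pseudo-features can then be fed together to a linear classifier.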