Paper Title
Meta Learning for Few-Shot One-class Classification
Paper Authors
Paper Abstract
We propose a method that can perform one-class classification given only a small number of examples from the target class and none from the others. We formulate the learning of meaningful features for one-class classification as a meta-learning problem in which the meta-training stage repeatedly simulates one-class classification, using the classification loss of the chosen algorithm to learn a feature representation. To learn these representations, we require only multiclass data from similar tasks. We show how the Support Vector Data Description method can be used with our method, and also propose a simpler variant based on Prototypical Networks that obtains comparable performance, indicating that learning feature representations directly from data may be more important than the choice of one-class algorithm. We validate our approach by adapting few-shot classification datasets to the few-shot one-class classification scenario, obtaining results similar to the state of the art of traditional one-class classification and improving upon those of one-class classification baselines employed in the few-shot setting. Our code is available at https://github.com/gdahia/meta_occ
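The sketch below illustrates the episodic meta-training idea described in the abstract, using the simpler Prototypical-Network-style one-class variant: each episode draws a few target-class support examples and a mix of in-class and out-of-class queries from multiclass data of similar tasks, and the encoder is trained end to end with the episode's classification loss. This is not the authors' implementation (see the linked repository for that); `Encoder`, `sample_episode`, the learned scalar bias, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of episodic meta-training for few-shot one-class
# classification with a Prototypical-Network-style scoring rule.
# Assumes PyTorch; names and hyperparameters are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    """Toy embedding network; an image task would use a conv net instead."""

    def __init__(self, in_dim: int = 784, emb_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, emb_dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def episode_loss(encoder, support, queries, query_labels, bias):
    """One simulated one-class episode.

    `support`: a few examples of the target class only.
    `queries`: examples labeled 1 (target class) or 0 (any other class).
    """
    prototype = encoder(support).mean(dim=0)           # target-class center in embedding space
    dists = (encoder(queries) - prototype).pow(2).sum(dim=1)
    logits = bias - dists                              # closer to the prototype -> more "in-class"
    return F.binary_cross_entropy_with_logits(logits, query_labels.float())


def meta_train(encoder, sample_episode, steps: int = 1000, lr: float = 1e-3):
    """Repeatedly simulate few-shot one-class classification episodes,
    using each episode's classification loss to learn the representation."""
    bias = nn.Parameter(torch.zeros(()))               # learned decision threshold (assumption)
    opt = torch.optim.Adam(list(encoder.parameters()) + [bias], lr=lr)
    for _ in range(steps):
        support, queries, query_labels = sample_episode()
        loss = episode_loss(encoder, support, queries, query_labels, bias)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return bias
```

At meta-test time, the same scoring rule would be applied with the frozen encoder: embed the few target-class examples, take their mean as the prototype, and flag a new example as in-class when its squared distance to the prototype falls below the learned threshold. Swapping the prototype-plus-threshold score for an SVDD objective fitted on the support embeddings would give the paper's other variant.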